Category Archives: Fixed disks

Netgear Stora upgrade v3: 2-disk-JBOD to 1-disk-JBOD

So, we’re butting heads against the storage capacity of our Netgear Stora again (93% full). The NAS currently has 2 x 2TB drives and no more free bays to drop drives into, so whatever the next arrangement is, it has to involve getting rid of at least one of the current drives. The Stora is currently backed up to an external drive enclosure with a 4TB drive mounted in it. Other things are also backed up on that external drive, so it’s more pressed for space than the Stora.

So here’s the plan:

  • collect underpants
    This was a flippant comment, but it’s upgrade season and we recently acquired a computer second hand, which had an i5-3470S CPU, the most powerful thing in the house by a significant margin. I wanted the dual DisplayPort outputs, but unfortunately it could only be upgraded to 8GB of RAM, so instead the CPU got swapped into our primary desktop (and a graphics card acquired to run dual digital displays). Dropping in a replacement CPU required replacing the thermal grease, and that meant a rag to wipe off the old grease, thus the underpants.
  • backup the Stora to the 4TB drive
  • acquire a cheap 8TB disk because this is for backing up, not primary storage
  • clone the 4TB drive onto it using Clonezilla
  • expand the cloned 4TB partition to the full 8TB of drive space
    Well, that didn’t work. Clonezilla didn’t seem to copy the data reliably, but admittedly I was running a stupidly old version. Several hours of mucking around with SATA connectors and Ubuntu NTFS drivers later, I gave up and copied the disk using Windows (a quick sanity check for a copy like this is sketched further down). It took several days, even using USB3 HDD enclosures, which is why I spent so much time mucking around trying to avoid it.
  • backup the Stora to the 8TB drive
  • remove the 2 x 2TB drives from the Stora
  • insert the 4TB drive into the Stora
  • allow the Stora to format the 4TB drive
  • pull the 4TB drive
  • mount the 4TB and 2 x 2TB drives in a not-otherwise-busy machine
  • copy the data from the 2 x 2TB drives onto the 4TB drive
  • reinsert the 4TB drive into the Stora
  • profit!

And, by profit, I mean cascade the 2TB drives into desktop machines that have 90%-full 1TB drives… further rounds of disk duplication ensue. The 1TB drives then cascade to other desktop machines, and yet more rounds of disk duplication ensue.
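
With this much data shuffling between drives, it’s worth having a cheap sanity check that a copy actually landed intact. Here’s a minimal sketch in Python — the mount points are made up for illustration, and on multi-terabyte trees you’d probably hash a sample of files as well rather than trusting names and sizes alone:

```python
#!/usr/bin/env python3
# Minimal sketch: sanity-check a bulk copy by comparing the source and
# destination trees on relative path and file size.
# The mount points below are hypothetical; point them at your own drives.
import os
import sys

def index(root):
    """Map each file's path (relative to root) to its size in bytes."""
    files = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            files[os.path.relpath(full, root)] = os.path.getsize(full)
    return files

def compare(src, dst):
    a, b = index(src), index(dst)
    missing = sorted(set(a) - set(b))
    differ = sorted(p for p in set(a) & set(b) if a[p] != b[p])
    print(f"{len(a)} files in source, {len(b)} in destination")
    print(f"{len(missing)} missing from destination, {len(differ)} differ in size")
    return not missing and not differ

if __name__ == "__main__":
    src = sys.argv[1] if len(sys.argv) > 1 else "/mnt/old-2tb"
    dst = sys.argv[2] if len(sys.argv) > 2 else "/mnt/new-8tb"
    sys.exit(0 if compare(src, dst) else 1)
```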

At the end of this process, the entire fleet will have been upgraded. But the original problem of butting heads against the Stora will not have been addressed; that will hopefully be a simple matter of dropping another drive in.

The last time we did this, we paid $49.50/TB for storage. This time around, it was $44.35/TB; a 10% drop in storage prices isn’t anything to write home about over a four-and-a-half-year window.
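
For what it’s worth, the annualised figure is even less impressive than the headline number — a quick back-of-envelope using the per-TB prices quoted above:

```python
# Back-of-envelope: price drop per TB, overall and annualised over 4.5 years.
old, new, years = 49.50, 44.35, 4.5
overall = 1 - new / old                      # ~10.4% total
annual = 1 - (new / old) ** (1 / years)      # ~2.4% per year
print(f"overall drop: {overall:.1%}, annualised: {annual:.1%}")
```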

Clone to a bigger drive, and convert MBR to GPT

I wanted to move part of my Windows setup onto a new drive.

Currently, Windows itself and Program Files are on the C: drive, which is an SSD (which I meant to blog about in detail, but never got around to), and documents are on the D: drive (which was the tricky bit of the SSD upgrade — doing it properly involves using SysPrep with an Unattend.xml configuration file that tells Windows that documents will live on D:, not C: — this article describes it in detail).

Anyway that’s really irrelevant to the problem at hand, which is that D: drive had run out of space. Here’s a brief description of what I did:

  • The new drive is a 4TB drive, replacing a 1TB drive.
  • Plug the new drive in, use Clonezilla to clone the old D: onto the new drive. Following the detailed instructions, this all went pretty smoothly.
  • But… the catch is that the old drive was partitioned as MBR, which has a 2TB limit. To go beyond that, you need GPT (a quick way to check which scheme a drive is using is sketched after this list).
  • I looked around for tools to convert the drive. It’s easy if you’re prepared to wipe it, but I wanted to preserve the data I’d just moved across. Finding ways to do it without wiping everything was tricky, but I settled on the free version of MiniTool Partition Wizard — this has an easy-to-understand interface, and did the job.
  • Once that MBR is converted to GPT, you can enlarge the partition to make the whole drive available.
  • Unplug the old drive, move the new one into the same slot as the old (this is on a Mac Pro booting Windows via Boot Camp) and it works. Done!
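
As mentioned above, here’s a quick way to check which partitioning scheme a drive is using, as a rough Python sketch. It assumes a Linux-style raw device path and 512-byte sectors (on a 4K-native drive the GPT header sits at byte 4096 rather than 512), and it needs root to read the device:

```python
#!/usr/bin/env python3
# Minimal sketch: report whether a drive's partition table is MBR or GPT.
# Assumes a Linux-style device path and 512-byte logical sectors; on a
# 4K-native drive the GPT header lives at byte 4096 instead. Needs root.
import sys

def partition_scheme(device):
    with open(device, "rb") as disk:
        lba0 = disk.read(512)   # MBR (or GPT's protective MBR) lives here
        lba1 = disk.read(512)   # a GPT header starts with "EFI PART"
    if lba1[:8] == b"EFI PART":
        return "GPT"
    if lba0[510:512] == b"\x55\xaa":
        return "MBR"
    return "unknown (no valid partition table signature)"

if __name__ == "__main__":
    device = sys.argv[1] if len(sys.argv) > 1 else "/dev/sdb"  # hypothetical
    print(device, "->", partition_scheme(device))
```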

PS. Similar exercise afterwards shuffling the OS X partition from a 320GB drive to the old 1TB. That required GParted, as it seems the GPT partition couldn’t be expanded due to a formatting issue (which GParted helpfully offered to fix as it started up) and another small 600MB partition being in the way — not sure what it is, but it seems to be essential for booting OS X — GParted was able to move it to the end of the disk.

Low spec notebooks can’t handle large amounts of RAM

Cathy and I are seeing increasing contention for the grunty computer in the house not dedicated to playing computer games. It’s used for a combination of recreational programming, web surfing and media encoding tasks. We decided to acquire a second, and after comparing the costs decided that the premium for laptop portability wasn’t too great (about $100; in fact that seems to be about the price of the OS we were forced to buy with the hardware). In our usage profile, “grunty” isn’t defined by CPU, but by responsiveness, which really comes down to how often an arm has to venture out across a spinning sheet of rust. Unfortunately, bottom-end systems (i3 class CPUs) can’t handle our base-level RAM requirement of 16GB, so yet again a portable computer is the most powerful thing in the house – the new system’s specs are:

Processor: AMD Quad-Core Processor A6-5200 (2.0GHz, 2MB L2 Cache)

25W of power consumption right there. Existing grunty computer pegs its CPU for about ten hours a year, in sustained encoding runs. We weren’t CPU bound, and yet the only way to get that RAM in an i3 lappy was to spend an extra $100 on a Toshiba with worse specs – so we got a quad core.

Memory: 4GB DDR3 1600MHz (max support 16GB)

That 4GB came straight out and was replaced by the most RAM that could be stuffed in there. The existing grunty machine had 8GB and was paging a lot. Why are web browsers so memory hungry? This upgrade cost $160.

Storage: 500GB (5400RPM) Hard Drive

This came straight out before the machine was even powered up once. It was replaced by a Plextor M5-Pro 128GB SSD; this unit was selected for its fast random write speed, and the common-for-all-SSDs 0.1ms seek time. Back in the day (about ten years ago) I advocated that when building a machine, you should get drives with the fastest seek times and screw everything else, plus all the RAM you could afford – to use as disk cache. How little things change. This upgrade cost $129.

After Linux Mint 13 Maya (LTS) was installed (consuming 6GB) there was 110GB free on the replacement device. Paging has been disabled due to the SSD write limitations, and tmpfs is used for various directories to further minimise our impact on the longevity of the drive.
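
A quick way to confirm that setup has stuck — assuming a Linux box, since it just reads /proc — is a small script that checks for active swap and lists the tmpfs mounts:

```python
#!/usr/bin/env python3
# Minimal sketch: confirm swap is off and list tmpfs mounts.
# Reads /proc, so this assumes a Linux system such as the Mint install above.

def active_swaps():
    with open("/proc/swaps") as f:
        return f.read().splitlines()[1:]   # first line is just the header

def tmpfs_mounts():
    mounts = []
    with open("/proc/mounts") as f:
        for line in f:
            _device, mountpoint, fstype = line.split()[:3]
            if fstype == "tmpfs":
                mounts.append(mountpoint)
    return mounts

if __name__ == "__main__":
    swaps = active_swaps()
    print("no swap in use" if not swaps else "swap still active:\n" + "\n".join(swaps))
    print("tmpfs mounts:", ", ".join(tmpfs_mounts()))
```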

Graphics Card: Onboard (Integrated)

The contention for the memory bus is troubling, but at least there’s no extra juice being sucked down to power a fancy-pants GPU. This is not a gaming machine; 2D acceleration is useful, 3D not.

Operating System: Windows 8 64 Bit

That went with the rotating media. We’re going to see if we can boot a desktop machine off of it and still have the OS believe everything is okay. The laptop didn’t like the new OS, saying “Selected boot image did not Authenticate. Press Enter to Continue”, but the solution was to disable Secure Boot.

Screen: 15.6-inch diagonal HD BrightView LED-backlit Display (1366×768)

It took some fiddling for Cathy to figure out how to dim the damn thing under Mint. Turned out the answer was to install the proprietary AMD drivers.

Audio: Dual Speakers Stereo DTS Sound+

If you’re using a laptop for A/V reproduction, you’re doing it wrong.

Connectivity: Gigabit LAN (RJ-45 connector), 802.11b/g/n WLAN, Bluetooth

The Toshiba only had 100Mb Ethernet, in this day and age! The Ralink wireless adaptor wasn’t picked up automatically by the installer, so Cathy got down and followed the instructions off AskUbuntu.

Built-In Devices: 1x USB 2.0, 2x USB 3.0, HDMI, RJ45 Ethernet, Headphone-out/microphone-in combo jack, SD/SDHC/SDxC Card reader

USB3 was important in picking the unit, as I’ve seen just how much faster it is. HDMI is necessary for twin-monitor development; MSY had a 21.5″ Full HD IPS on sale for $118.

Webcam: HP TrueVision HD Webcam with integrated dual array digital microphone

I’d just paint over it, but there’s a chance that we’ll have a use for videoconferencing. It stays, but it had better mind its Ps and Qs or else it’s black electrical tape for it.

Optical Drive: DVD Burner

Yeah, like that’s ever getting used.

Weight: 2.33 Kg

I’m more used to computers that weigh 1Kg, not two and a half.

Dimensions: 56cm (L) x 13cm (W) x 34.5cm (D)

This thing has a widescreen display; it’s freaky big compared to my 10” netbook.

Other observations: the keyboard sucks balls, with the trackpad positioned such that you physically can’t touch-type on it because doing so places your palms on the trackpad, moving the mouse and screwing up your input (I think this is happening because gestures have been turned on; they might find themselves getting turned off again). For some messed up reason they’ve included a numeric keypad, so touch-typing is doubly hard – again with the palms. This thing’s going to find itself plugged into a USB hub with a real keyboard and mouse quite a lot I think.

Anyways, the HP Pavilion 15-E001AU was purchased from MLN for the low, low price of $500. Total system cost was $907, and at the end we had a 4GB lappy stick and a 500GB lappy drive lying around.

Upgrading Netgear Stora without data loss

Despite my expectations, I’ve managed to upgrade our NAS’s storage quickly, easily, and without losing a byte of data.

We have a Netgear Stora as our home NAS. We’ve been butting heads against the storage limit of the box, but I’ve always been careful not to populate the second drive bay; the last upgrade replaced the single 1TB drive with a single 2TB drive – 2TB was the cost/storage sweet spot. However, a couple of years on and it’s still the sweet spot, the largest drive capacity is only 3TB (I suspect due to the Thailand floods of 2011 – we’ve been stalled at this capacity for a while… which is a little misleading, but I’m not paying $550 for a 4TB drive when I can have 3TB for $150) and it seemed like it was time to exploit the second drive bay.

Researching online shows that the default configuration for a Stora is RAID 1, which is… not the default I’d have chosen. What we want is a JBOD array. I didn’t recall changing the configuration the last time we did an upgrade, so it’s a safe bet that we were still a RAID 1 setup. The documentation is clear that converting from RAID 1 to JBOD or vice versa requires a format of the media, so step 1 was to ensure our backup of the backup was up-to-date; that took overnight to complete, even with the 2TB USB3 drive that we picked up for only $99 from Officeworks (how are they able to sell a drive and enclosure for the same price as a cut-price parts supplier sells the naked drive?)
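
I didn’t verify it this way, but the Stora is a small Linux box under the hood, so if you can get a shell on it you can check what the array actually is rather than trusting the documented default. A rough sketch that parses /proc/mdstat and reports any md arrays and their levels (the file’s format varies a little between kernels):

```python
#!/usr/bin/env python3
# Minimal sketch: list Linux software-RAID (md) arrays and their levels by
# parsing /proc/mdstat. Run on the NAS itself (e.g. over a shell/SSH session).
# mdstat's format varies slightly between kernels, so this is only a rough parse.
import re

def md_arrays(path="/proc/mdstat"):
    arrays = {}
    with open(path) as f:
        for line in f:
            # Typical line: "md0 : active raid1 sda1[0] sdb1[1]"
            match = re.match(r"^(md\d+)\s*:\s*active\s+(\S+)", line)
            if match:
                arrays[match.group(1)] = match.group(2)
    return arrays

if __name__ == "__main__":
    arrays = md_arrays()
    if not arrays:
        print("no active md arrays found")
    for name, level in arrays.items():
        print(f"{name}: {level}")
```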

If anyone can explain why I was getting over 70MB/s to my external USB3 hard drive when I started, and a few hours later when I went to bed I was getting under 30MB/s, I’d love to hear it. It was a steady decline in I/O rate and I’m at a loss to explain it.
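
I never did work out the answer, but next time I’d at least capture the decline as it happens. A rough sketch that samples how quickly the destination is growing and logs an effective MB/s per interval — the destination path is hypothetical, and walking a large tree every minute isn’t free, so treat the numbers as indicative only:

```python
#!/usr/bin/env python3
# Minimal sketch: watch a long copy by sampling how fast the destination
# directory tree is growing, and log an effective MB/s per interval.
# The default path is hypothetical; walking a huge tree each interval is not
# free, so treat the numbers as indicative rather than precise.
import os
import sys
import time

def tree_size(root):
    total = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass                      # file vanished mid-walk; ignore it
    return total

def watch(dest, interval=60):
    previous = tree_size(dest)
    while True:
        time.sleep(interval)
        current = tree_size(dest)
        rate = (current - previous) / interval / (1024 ** 2)
        print(f"{time.strftime('%H:%M:%S')}  {rate:6.1f} MB/s", flush=True)
        previous = current

if __name__ == "__main__":
    watch(sys.argv[1] if len(sys.argv) > 1 else r"E:\stora-backup")
```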

Anyway, with the backup completed and verified, it was time to bite the bullet. For step 2 I powered down the NAS, extracted the existing drive from the NAS and put it aside, took our lying-around-spare 2TB drive and shoved that in its place and then restored power. I fired up the (Windows-based) Stora management software and connected to the Stora, and it announced that there was some weird drive mounted, and what storage configuration did I want? I picked JBOD, and it then proceeded to format the drive.

Once the formatting was done, I proceeded to step 3. I powered the Stora down, inserted the original drive into the previously unused bay (the vertical orientation flipped relative to the other bay, which was surprising) and restored power. I fired up the (Windows-based) Stora management software and connected to the Stora, and it announced that there was (again) some weird drive mounted, and what storage configuration did I want? Annoyed that it didn’t remember I’d already picked JBOD, I picked it again, and it proceeded to format the drive, both as expected (per advice on the Internet) and as it had last time. There was a slow-moving progress bar and everything.

Once that was all done, I got ready for step 4: restore the backup. I browsed to the mount, and discovered all the data was already there. Every last byte. The lying bastard of a thing had formatted nothing. The carefully prepared backup was not needed; I spent several long moments stunned, absolutely stunned.  I even ran a few checks to make sure I wasn’t being lied to, that the OS had cunningly cached the directory structure. But it was true; I could play media, read configuration files, the works. Free space was now reported as 2.2TB. I’d suspected there was a chance that this would work (JBOD shouldn’t require any special formatting, unlike RAID 0 and perhaps RAID 1), but still couldn’t believe it.

A technology upgrade worked, and contrary to advertised capabilities. Has this ever happened before?

NAS on an old PC

So having now got my new PC set up, the old one is to be converted into a NAS server to hold all the kids’ home videos, photos and all our other files that need sharing. (No, I’m not in the fiscal mood to go and buy a Microsoft Home Server.)

There seem to be a few different free or cheap options.

Requirements? Well the machine I’m looking at is an old P3-500 with 192MB RAM (though I may be able to scrounge some more for it). RAID 1 would be nice (though I’m going to have to check my drive sizes), for some redundancy. Web interface definitely good. User-level permissions so the kids can’t accidentally delete my files.

FreeNAS appears to be very lightweight, and to run well on old hardware. This review gives it a good rap, though the people who have commented on it give it mixed reviews. It’s still a beta product (quite possibly a perpetual beta), which makes me somewhat wary of entrusting my files to it. I tried it out. Despite its beta status and small volunteer dev team, it seems quite polished. Boots up fast, and bleeps when it’s ready, which is handy if it’s sitting out of the way, not plugged into a monitor. Haven’t actually created any shares etc yet, but will keep playing.

OpenFiler seems to need more resources (256MB, they say) and seems to boot a bit slower — about 90 seconds in my test on the old box — but appears to be more mature and fully-featured, with commercial support options behind it.

There are other cheap (but not free) options such as NASlite, though it’s a little hard to judge if it would work on my old hardware, and there doesn’t seem to be an evaluation copy available.

ClarkConnect comes in a freebie (community) version, and also has a whole bunch of other groupware features, as well as things like firewall and print services. Probably a bit over-the-top for my needs, though you can choose just to install the stuff you need. Print services might actually be quite handy.

And of course I could chuck on any Linux distro I chose and configure it with Samba (as per the APC Mag article last month that got me thinking about this). Which would mean I had a full Linux installation to muck about with when I wanted, and/or host dev tools like Apache and MySQL on. Though it sounds too much like hard work to me, and it might be a strain on the old box.

What other options should I look at? Or should I just stick with FreeNAS, which appears to be the most promising so far?

One interesting comment I read: that for home use, old PCs (that may not have been designed with modern power conservation in mind) may suck down a lot more power than the average dedicated NAS device. Good point, hence I want a quick boot time so the box doesn’t need to be left running permanently.

Handy link: How to: Install FreeNAS

MODM, NAS, APC and other acronyms

I’m sorry Cam. I was intending to go to tonight’s MODM (Melbourne’s Online Digital Media) event at Fed Square, but after a bugger of a day at work (that started before I even left the house, and got steadily more frenetic) frankly, on a cold night like this, I just wanted to get home to my warm house and a bowl of soup. Hope it went well though.

In my spare moments today, I’d been eyeing off today’s Zazz offer — a basic desktop machine for A$453 (inc shipping). Basic it may be, but it’s actually got more grunt than my secondary desktop machine, which is getting old and is far from dazzling in its speed, and sometimes frustrating compared to the faster PC. (Also its USB ports don’t work, and I haven’t got the energy/expertise to figure out why.)

I was finding that tempting enough, then I found myself reading this month’s APC on the train home, an article about setting up a NAS on an old PC. Ooh. Now there’s an idea. Glenn isn’t the only Geekranter who’s been looking at options for this — it’s been something I’ve been thinking about for some time now. (I did try leaving files on the MG35, but it’s not ideal, and it’s very slow via Ethernet.)

So I’ve ordered the Zazz deal for a new secondary desktop, and while I wait for that, I’ll try and figure out how to swap the Windows XP licence off the old PC and onto the new one (and Ubuntu onto the old, to run the NAS as per APC’s suggestion — though NAS-specific OSs such as FreeNAS also look like a good option).

Cheap and cheerful disk benchmarking

Freebie disk benchmarking: Disk Bench. Does quick tests by reading/writing/copying files of your preferred size, and tells you the speed. The only downside is it requires the .NET Framework. (Which explains why the Disk Bench download is so small.)

I found this while pondering why my secondary computer is running so slowly. Confirmed my suspicions: Windows is installed on the slowest of the old drives in the beast. Time for a quick re-install.
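
For a rough home-grown equivalent of what Disk Bench does, here’s a minimal Python sketch: it writes a throwaway file on the target drive, forces it to disk, reads it back and reports MB/s. The read figure will be flattered if the file is still sitting in the OS cache, so take it with a grain of salt:

```python
#!/usr/bin/env python3
# Minimal sketch: crude sequential write/read benchmark, Disk Bench style.
# Writes a throwaway file on the drive under test, times it, reads it back.
# The read figure can be inflated by the OS cache if the file fits in RAM.
import os
import sys
import time

def bench(path, size_mb=256, block_kb=1024):
    block = os.urandom(block_kb * 1024)
    blocks = size_mb * 1024 // block_kb

    start = time.time()
    with open(path, "wb") as f:
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())              # make sure it actually hit the disk
    write_secs = time.time() - start

    start = time.time()
    with open(path, "rb") as f:
        while f.read(block_kb * 1024):
            pass
    read_secs = time.time() - start

    os.remove(path)
    print(f"write: {size_mb / write_secs:6.1f} MB/s")
    print(f"read:  {size_mb / read_secs:6.1f} MB/s")

if __name__ == "__main__":
    # Point this at a file on the drive you want to test.
    bench(sys.argv[1] if len(sys.argv) > 1 else "benchtest.tmp")
```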

XP defrag

I’m not overly impressed with the Defrag utility in Windows XP. In my eternal quest to try and speed up my mysteriously slow work machine, I decided to give it a go. Cleaned up a bunch of files first, to give the C: drive 6.5GB free (out of 29.3GB). Analyze: said I should defrag. OK, so I left it running over night…

Came in the next morning. The little colour graph showing where files are didn’t look terribly different from how it was left. Still lots and lots of red (fragmented files). It said it couldn’t defrag some files… basically anything over 15MB.

Defrag

Out of curiosity, I clicked Analyze again. “You should defrag this volume.” What, again? What’s the point?!

I did some more purging and eventually ended up with about 10GB free. Tried it again. Better, but it still couldn’t/wouldn’t move anything bigger than about 30MB. Weird.

At least the machine seems to have sped up a tad now.

Briefs

One more reason Lego rocks: they don’t mind if people hack their stuff.

Need to wipe, kersplat, zap, nuke, delete a hard disk, but don’t want to have to physically pull it out of the machine and jump on it, drown it, then take a hammer to it? Like, if you want someone else to be able to use it? Try Darik’s Boot and Nuke. (via Colin)

With hot rumours of the Australian iTunes shop being about to launch, this guide to DRM covers how various online stores restrict what you can do with the music you buy.

Backup, backup, backup

I was sick at home for a couple of days last week, and while pottering about the house blowing my nose, found some old floppy disks. I decided to move all their data onto CD, and in the process found some old articles I wrote in 1997 for an abortive gig as a columnist for a US-based magazine. Some of them are still relevant, so I’ll re-post them here every Monday for the next few weeks.


In the computing world, stories abound of people losing large chunks of work. This never used to happen, because people used to use far more reliable, but arguably less productive, methods of working. Like paper. Okay, so if all your work was on paper, you could lose large chunks of it, but this tended to be because of something disastrous – an enormous fire, perhaps – and in that situation, life and limb is going to be the first priority, not your work.

Modern technology, however, has brought with it a multitude of new and exciting ways of losing all your work. Hard disks can crash, or develop errors. They can be accidentally formatted. Your files can be moved, deleted, corrupted, overwritten. This is why you need to take very good care of your files. Back them up regularly, or the day may come when your work is lost and you don’t have any way of recovering it.

A few years ago, I was writing software for a big company. My colleagues and I had performed a true miracle of coding, and had delivered a piece of software that would change for the better the lives of hundreds of people working in that particular bit of the company. Okay, so it wasn’t going to solve third world hunger or bring world peace, but we were very pleased with it.

One Friday afternoon, I was looking on our shared network drive at the files that made up our masterpiece, when I noticed something odd. Some of the files and directories that I expected to be there, weren’t. I looked again. More were missing. They were disappearing before my very eyes.

I, not to put too fine a point on it, panicked. I sent a system broadcast message asking anybody who might be listening “Why are the files on N: disappearing?” I looked again. The files stopped disappearing, but most were already gone. The phone rang. I answered it.

“Uh oh”, said the quavering voice of the LAN Administrator on the other end of the phone. He had been given the task of clearing up one of the file servers. He had used a utility’s PRUNE command to do it. A flawless plan. Just one small snag. Wrong server.

No problem, right? Go to the backups, right? Wrong. It just so happens that the LAN people at this place had been a little lax in the backups department. For about 3 months. Yes, THREE months. It was when we realised this that we decided to call this day “Black Friday”, and we spent most of the rest of the afternoon moping around the office looking miserable. You can bet that if there had been supplies of alcohol available, they would have been consumed quite rapidly.

As it happens, there was a consolation. I had copied many of our more important files onto my hard drive, a mere three weeks before Black Friday, “just in case”. Three weeks’ work lost wasn’t exactly a cause for celebration, but it was better than three months’.

I didn’t feel vengeful towards Mr Pruner. Mistakes happen. What wasn’t forgivable, in my book, was the conduct of his boss, whose responsibility it was to ensure that the backups happened, so that when mistakes like that happen, the files are recoverable. It’s just as well that he’s substantially bigger than me, otherwise murder might have been committed that day.

The moral of the story is this: Make sure your files are backed up. Frequently. Double-check that it’s actually being done. Triple check, even. If someone else does it, make a spare set yourself occasionally. If you don’t, then make sure there’s plenty of alcohol in the office fridge. Because when Black Friday hits you, it might be the only help available.