Backup Updated

February 17, 2018 at 4:53 pm | Posted in Backup, Computers, Software

I’ve written several articles on computer backup over the years. For example:

Backup That Counts reviews types and locations of backups.

Backup That Works talks about why you want a backup, and why you want to choose the system carefully.

However, many of the programs I’ve recommended have not been upgraded for more recent Windows versions. It’s time for a new article.

For my work computer, I have three kinds of backup:
– a weekly system image of the boot drive
– a daily data backup of changed files – a differential with a monthly full, three months kept
– backup on save (real-time or file sync) of key files or folders from my current projects, with versions
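The backup-on-save idea can be sketched in a few lines of standard-library Python – a hypothetical polling watcher that copies a file to a versioned name whenever its modification time changes. Dedicated tools do this more robustly (real-time hooks, retention rules), so treat this as an illustration of the technique, not a replacement:

```python
import shutil
import time
from pathlib import Path

def backup_if_changed(src: Path, dest_dir: Path, last_mtime: float) -> float:
    """Copy src into dest_dir with a timestamped name if it changed.

    Returns the mtime that was seen, so a caller can poll in a loop and
    feed the value back in on the next check.
    """
    mtime = src.stat().st_mtime
    if mtime != last_mtime:
        dest_dir.mkdir(parents=True, exist_ok=True)
        stamp = time.strftime("%Y%m%d-%H%M%S", time.localtime(mtime))
        # keep every version: notes.txt -> notes.20180217-165300.txt
        versioned = dest_dir / f"{src.stem}.{stamp}{src.suffix}"
        shutil.copy2(src, versioned)
    return mtime
```

A caller would poll this every few seconds with the destination on a different drive; each save then leaves behind its own timestamped copy.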

The first ensures I can recover my system and get up and running quickly. Standard backups don’t work well for restoring operating systems – you want imaging for that. The second ensures I have copies of, and quick access to, everything I’ve worked on or collected. The third makes a copy each time I save the file, ensuring I lose little time if key files are corrupted, deleted in error, or similar. I can get working again quickly even if the whole system dies.

While I’ve not needed these backups often, they make a huge difference when I do. Some files could not have been recovered any other way.

While Windows 10 includes a backup tool, I don’t like its approach. It’s hard to verify a backup when you can’t tell what it’s doing. Other tools I’ve used have become out-of-date or insufficient.

I found maintaining three different programs annoying, so I migrated to AOMEI Backupper for the first two types of backup. It includes a tool to create a bootable disk so you can access your backup without the software installed, such as when moving to a new system. Then, after testing, I upgraded to the paid version to get the third type, File Sync. The paid version also has some useful tools I wanted. AOMEI has served me well for a while now.

I’ve not found other tools that combine these techniques. AOMEI isn’t perfect. It’s not always clear what some settings mean. But their web site covers the process for configuring most options. Most importantly, the software has been very reliable. Once it’s set up and scheduled, it’s simply worked.

Book Publishing – Part 1 of 2

August 28, 2017 at 11:04 pm | Posted in Backup, Books, Design, Online services, Software, Writing

Many people have thought about writing a book. A small percent of those ever start. An even smaller percent get it written and a still smaller group try to get it published.

Nowadays, the majority of books are self-published. The average book sells fewer than 100 copies. Most published authors also have ideas that never see the page, half-finished works, and works that never went to publication.

Clearly, writing a book requires determination and passion. Self-publishing adds quite a few other hurdles to the equation. Writing turns out to be just the first step. Getting it out there requires many more steps. You can pay to get professional help for almost all of it (called a vanity press) but is that cost-effective for the market you have?

Some steps require help. But many can be accomplished with a little learning and free or low-cost resources.

I’ve recently been through this process myself. I’ve attended several publishing workshops and writers groups, heard many presentations by people in the industry, and have been researching the software and documenting my process. Other authors have found the tips valuable, so I thought it would be useful to share some of what I’ve learned.

The first thing to understand is that desktop publishing revolutionized book publishing too. The changes are still coming. How-to-publish books from two years ago are no longer current. You can sell your self-published book internationally through dozens of outlets and even get into the catalogs of traditional distributors for libraries and bookstores.

But to get any real uptake requires you create a professional product. While it’s possible to draft a book in Word, upload that into Amazon as an ebook and offer it to the world, the likelihood of that going anywhere is tiny. That’s like putting a lemonade stand on the street and expecting the money to roll in. You’re competing with thousands of others around the world.

Further, if your audience happens to find your book but cringes at the cover or opening pages, that’ll kill sales, lose you money on bookstore returns, and get bad reviews. Unprofessional work lowers the whole market.

Following is a list of some of the stages of a book project. Each requires different skills and often different tools. Below, I’ll go into each section and suggest tools and tips that may work for you. This article assumes you’ll be producing print and ebook versions of your book to reach the largest number of international readers. Ebooks alone are easier to prepare, but those files can’t be used for print. You can down-sample your print design into an ebook, though. We’ll design for print, then output print and digital editions.

Note that this is an overview. Many of these topics have entire professions and websites dedicated to them. I’ve added numerous links to more information. The software I suggest is Windows-based, although some of it is available for other platforms.

Interior Design
Cover Design

Part 2:
File Conversion
Web Design


This is a professional project so you need an appropriate place to write, a decent chair, uninterrupted time, and so on.

You’ll want to set up a folder structure on your computer to store your book files in. Just like a filing cabinet. It can be a folder on your desktop but you want to take special care of these files as they’ll contain many hours of work.
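As a sketch, that folder tree can be created once with a short script. The folder names below are my own illustration, not a prescribed layout – organize however fits your workflow:

```python
from pathlib import Path

# Hypothetical layout for a book project -- adjust names to suit yourself.
BOOK_FOLDERS = [
    "manuscript/drafts",
    "manuscript/edited",
    "images",
    "design/interior",
    "design/cover",
    "marketing",
]

def create_book_project(root: Path) -> list[Path]:
    """Create the folder tree for a new book project and return the paths."""
    created = []
    for name in BOOK_FOLDERS:
        folder = root / name
        folder.mkdir(parents=True, exist_ok=True)
        created.append(folder)
    return created
```

Run it once against the root folder for the new book; `exist_ok=True` makes it safe to re-run.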

If you haven’t already, you also want a backup system. I’ve seen authors lose their entire book in one hiccup. Have an automated backup. Make copies of different versions when you make major changes, like prior to editing. The ideal for creatives is a backup-on-save tool like File Hamster (free after the trial but needs .Net2) or AOMEI Backupper Pro. The latter has Real-time Sync in the paid version, along with the system and data backup tools from the free version.

It’s also a solo profession, so you’ll find connecting with other writers and sharing tips valuable. Most areas have local writers groups. Just beware of groups where no one is producing work.

There are also on-line groups and sites you may find valuable. Just remember this is networking time, not work.

Finally, if you’re putting in a lot of hours, here’s a site of wellness tips for writers.


Your primary tool for writing is typically a word processor. Many people just use what they have, but there are excellent free alternatives that work with standard formats, including OpenOffice and LibreOffice. LibreOffice is a fork of OpenOffice that has advanced further in development. The interface looks much like Word before the ribbon – many prefer that. Both support open standards.

You also have other choices. Inexpensive tools like Scrivener support the overall writing process. Others use clipping tools like Evernote to gather material. Recent versions of Windows include OneNote, or it can be installed free. I paste notes into searchable text files.

I’ve been writing on-line for a long time so I migrated to using Notepad++. It’s a text editor with spell check. It keeps me focused on the writing and doesn’t add unnecessary code. I’ve used it for long-form writing as well, migrating to LibreOffice when it’s time for formatting and sharing with the editor.

It’s usually best to stay with the flow of writing and leave the editing for later. Get the ideas down, then organize them. Avoid the temptation to format too soon. Lots more polishing is needed before you make it look pretty.

Once you get the content on the page and into some kind of structure, then you can go back over the words and begin smoothing.

Most authors benefit from a little planning, like an outline and structure so they can organize rough chapters and place the content. You may find a writing workshop valuable, perhaps one for your genre. Be forewarned that many publishing workshops are designed as sales fronts for vanity presses. They can still be valuable but only if it isn’t all about their sales channel – if they actually help you structure your book.

Ever notice how Google often finds the same articles on multiple blogs without credit? When you’re charging for a book, you copy at your peril. Plagiarism is easy to check. In fact, some editing tools include plagiarism checkers so you can ensure you’re not wording things too much like another source.

Quoting is fine but give valid credit and use valid sources. There are a lot of badly attributed quotes out there, especially for people like Mandela and Einstein. If it doesn’t say where they said it, it’s not a valid source as it can’t be verified. Sites like WikiQuotes can help ensure you’re using legitimate ones.


Once you have a rough draft, you need to formalize the book structure more: chapters, subsections, footnotes and so forth. Also roughly place images and tables.

Here’s an article that talks about the front and back parts to plan, especially for non-fiction books.

If you’re not using a word processor, it’s time to migrate your copy there. Just rough in the layout though – like bolding titles. Detailed formatting and design will be done in other tools after a lot more editing.


This is the step that requires professional help. It’s the step that will give your book a professional polish and readability. Even professional editors will hire another editor for their own writing.

However, before you head to an editor, you can save a great deal by first using one of the better editing tools like ProWritingAid. Then you’re not paying someone to fix your basic typos and glitches. ProWritingAid has a free on-line tool you can try, but for a book-sized project you’ll want more. They have several options, including a Word plugin and a stand-alone program. To give you a sense of how thorough it is, the tool has 25 reports. As you get to know it, you’ll find your writing has typical weaknesses best addressed with certain reports. For example, if you’re prone to over-use words or use clichés, use those reports. But if not, you might skip them.

One author wrote that she uses EditMinion, a free online tool, first. Then she uses ProWritingAid.

With that level of polish, you’re ready for a professional editor. Hopefully what you need is line and copy editing, not a rewrite. (There are many types of editing.)

Your best source for an editor is other authors’ recommendations. I’ve seen people with neither an English degree nor experience hang out an editing shingle as a work-at-home project. Don’t take shortcuts. You can also solicit bids from sites like Reedsy.

Typically, you’ll send a sample and they’ll let you know how much work it needs. Then you’ll have an estimate of cost and time. The editors I’ve worked with requested Word docs, turned on Edit/ Track Changes, and marked up the files. You can then accept or reject their recommendations. Much easier than retyping although some of that will be called for too.

Be prepared for lots of changes. The object here is clear communication, not saving your little gems. A good editor fixes issues with clarity, grammar, and flow. They don’t change your voice or influence your story (unless it needs a reworking). If they do, look elsewhere. This is your book, not theirs.

If you’re making use of real-world or historical facts, this is a good time to verify your sources.

If you’re writing non-fiction, you also want to be building a Bibliography and references. Here’s an easy, free on-line citation generator for your Bibliography. (choose the style you want: Chicago, APA, etc) Just copy and paste them in alphabetically.

Once the whole thing is put together, it’s useful to have a few readers go over the text to make sure everything is clear to them. You want to be sure readers don’t get lost or stuck somewhere.

Then you run the entirety through a final proofreading aka a re-edit. Resist the urge to tweak the text after this stage as you can add new errors. Consider the content done.


Every published book and every format of that book (soft cover, hard cover, epub, Kindle, PDF, etc.) requires its own ISBN. It will appear on your printed back cover, your copyright page, and on the book’s sales web page.

While you can pay for ISBNs when uploading through Amazon and other distributors, that will tie your book to them as the “publisher.” You may have to get a new ISBN for other outlets. This will split up your sales data and lower your book’s presence, and thus sales.

A similar thing will happen if your book is later picked up by a publisher but in that case, you’d only migrate to the lower take of a publishing deal if there are expectations of higher sales. As a publisher would normally re-edit and design a new cover, it would be a new edition, anyway.

Your better bet in self-publishing is to create an “imprint.” Essentially, you make up the name of a publishing entity that represents your books and ties into your “brand.” Then you order your own ISBNs under this. This becomes your “publishing company.” (Some charge for this tidbit.) Mine, for example, is Davidya Publishing. If there are tax advantages, you can formalize the company later. In the US, the government farmed out the sale of ISBNs to Bowker. In Canada, you can get ISBNs from the government for free. For other countries, just search “ISBN CountryName.” Each varies.

With your ISBN, you’re ready to design your book. You can start the book design before getting your ISBN but you’ll need it for the print cover.

Interior Design

Your first decision before you begin design is to choose a book size. Unless you have a great reason, I’d strongly recommend a standard size.

Most recommend you get a book designer to design your book professionally. Interior Design is the look of the inside of your book – the fonts, headings, icons, page numbering, spacing, gutter, and so forth. This may seem simple but a poorly designed book is harder to read and will turn people off. Your book is not a school essay but a product you’ll be offering for sale. Does it look like it’s a commercial product?

You can ask other authors for recommendations or get bids for a book designer at 99designs.

If you have design skills and you’re going to tackle your own design, take a look at how others have designed their books, especially in your genre. Even if you do plan to hire a pro, you may find reading this over will help you understand what you’ll need from them.

Several experienced authors strongly recommended Adobe InDesign, saying it was worth the cost and learning curve long term. I’m happy I took their advice. You don’t need the latest version but your printers are set up to work with InDesign output. Consider the cost relative to a designer over several books. You’ll also be using it for the Cover design, if you’re tackling that too. It has a learning curve, but that’s easier if you’ve used other Adobe products like PhotoShop or InDesign’s predecessor PageMaker. And there’s lots of on-line help.

A free alternative that runs on many platforms is Scribus. I understand there can be some problems with uploading its output to printers but that these can be fixed in Acrobat. But if you need Acrobat, why not just get InDesign?

In InDesign, create a file for each section and chapter (don’t skip this), copy the content from your polished work into the files, then assemble the files as a Book. For chapter file names, start them with numbers to help organize them, and avoid spaces in the file names – spaces will cause hassles later in the ebook world.
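A small helper can generate file names that sort correctly and contain no spaces. The numbering and hyphen scheme below is just one reasonable convention, not a requirement:

```python
import re

def chapter_filename(number: int, title: str, ext: str = ".indd") -> str:
    """Build a sortable, space-free file name like '03_The-Long-Road.indd'."""
    # Replace runs of spaces and punctuation with hyphens; zero-pad the
    # chapter number so files sort in order (01, 02, ... 10, 11).
    safe = re.sub(r"[^A-Za-z0-9]+", "-", title).strip("-")
    return f"{number:02d}_{safe}{ext}"
```

For example, `chapter_filename(3, "The Long Road")` yields `03_The-Long-Road.indd`.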

Also recognize that a bound book has specific layout requirements. You want to start right.

Choose your fonts. Make sure you can use the fonts commercially. Some downloaded fonts don’t authorize commercial use, for example.

Remember your basic design principles:
– fonts and other design elements should be the same or different, not similar. Similar looks like a mistake.
– traditionally, body text is serif fonts while titles are sans serif.
– make sure the cover is legible. It won’t help you if the title is hard to read or can be misread.
– paragraphs can be indented or not. You don’t need double spaces after a period. (these are old typewriter rules)

Set up pagination (File/ Document Setup to adjust). Usually each chapter is set to an even number of pages to ensure new chapters start on the right side.

Design one of the early chapters first as a design template, adding the styles for titles, sub-sections, quotes, paragraphs, footnotes, etc. Then set this file as the default Style Source (left side of the Book list) and copy the styles out to the other chapters. You may also want to edit the default paragraph in InDesign, or replace it in each file. Then you just go through your text and apply the styles.

Images should be at least 300 dpi so they print clearly. Only use images you have the rights to, and give credit in the book. Again, you’re selling the product, so using others’ work without rights is theft. You don’t want your distributor to delete your book due to a complaint.

Here are a few articles that go over setting up your book in InDesign. Once you get the basics working, it comes together quickly.

One weakness of InDesign is it does footnotes but not endnotes. If you want endnotes, set footnotes then convert them to end-of-chapter notes or end-of-book notes. I used these scripts. You can rerun scripts to update changes but it’s easiest to do this once when the content is stable.

You’re also adding the opening and closing sections like the title and copyright page, dedication, index and so on. (see link in Layout above) InDesign has a tool for creating an Index from words you mark. (see the Index panel) It will also create a Table of Contents (TOC) from the titles and sub-sections you’ve styled. You can also use the table of contents tool to create a list of illustrations or tables in a similar way. Style the related text appropriately and distinctly, then use that to structure your TOC.

You had to polish every bit of text over and over. Now you have to polish every bit of the design over and over. Random things that happened during writing and editing can create little layout bugs. For example, two line breaks instead of one hard return can create different spacing. Some things are hard to find in InDesign; a quick search on-line usually finds the solution.

When you output the ebook version later, it will strip some of this formatting for you, like page numbers. They’re of no use when the text reflows to the device screen size. But you must be fussy about this step for the print version.

For the final print version, you’ll want to be adding some custom spacing to ensure subtitles are not at the bottom of pages and so forth. But leave these edits out for now as you don’t want to mess up your ebook version.

Cover Design

This is the #2 place where professional help is most recommended. Your cover design will determine if someone even looks at your book. If it screams amateur, they’ll assume the content is too. (Yes, people judge a book by its cover.)

Sure, you can auto-generate a cover in Calibre (covered in Part 2) using your ebook’s metadata, but it will look auto-generated. You can also create a cover in CreateSpace for Amazon. But again, generic parts make for a generic look.

As above, you can use 99Designs to find a cover designer or talk with fellow authors for recommendations. Some designers will do both interior and cover at a slightly reduced rate. If you’re doing an ebook as well, you’ll also want the digital front cover. 99Designs also has a deal for IngramSpark customers.

If you happen to have graphic design skills, you can study how professional book covers are designed in your genre, then use design software of your choice. But note that the output of that software is what you’ll be uploading to the printers. They’ll reject files that don’t meet professional standards. They don’t accept JPGs for print, for example. Again, InDesign is recommended.

Remember that the cover will be printed so the colours have to be in the CMYK gamut or your cover can look quite different printed than you expected.

Again, use at least 300 dpi images and only use images you have the rights to and give credit in the book.

In the distribution section, I’ll be recommending you upload directly to Amazon, as it’s the largest bookstore in the world. And I’ll recommend you upload to Ingram to get in their catalog, plus get distribution through the world’s other ebook stores. This covers nearly everyone else, including libraries and bookstores.

To build your cover correctly, you need a template set to the right size – both the cover size and the spine. The spine is determined by the number of pages. Your print book cover will be printed as a “spread” of the front, spine and back so everything has to be exactly the right size.

You also have to build the cover with “bleed.” This means having extra image around all the edges so the cutting of the cover doesn’t leave any unprinted trim. Usually .125″ on all sides but your printer may vary this.
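The spread arithmetic looks roughly like this. The spine formula (page count divided by the paper stock’s pages-per-inch) and the .125″ bleed are common defaults, but they’re assumptions here – your printer’s template is the authority:

```python
def cover_spread_size(trim_w: float, trim_h: float, pages: int,
                      ppi: float = 444.0, bleed: float = 0.125):
    """Return (width, height, spine) in inches for the full cover spread.

    Spread = back cover + spine + front cover, with bleed added to all
    four outside edges. Spine width = pages / ppi, where ppi is the
    pages-per-inch of the paper stock (an assumed typical value here).
    """
    spine = pages / ppi
    width = trim_w * 2 + spine + bleed * 2
    height = trim_h + bleed * 2
    return round(width, 3), round(height, 3), round(spine, 3)
```

For a 6″ × 9″ book of 222 pages this gives a 12.75″ × 9.25″ spread with a 0.5″ spine – but again, download the supplier’s template rather than trusting the math.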

While there are formulas for calculating all this, it’s easier to download templates from the suppliers. Ingram’s will include your ISBN barcode too. If you plan to sell your print book internationally, I’d recommend not including the price in the barcode. It will be set in the particular sales channel.

Again, the print cover will be CMYK and the ebook cover is RGB. But it’s easier to stick with one version until you get to the conversion stage.

Getting an IngramSpark template

CreateSpace (Amazon) template

This completes the Design phase of your books production.

In Part 2 on ForNow, we’ll convert the book to the final formats and prepare the book for uploading and distribution.


Running Android on a PC

March 13, 2014 at 8:52 pm | Posted in Backup, Computers, Economoney, Games, Hardware, Software, Technology

Say you want to mess around with Android and apps, but you’re a little nervous about experimenting on that rather expensive phone or tablet. One solution is to load Android into a virtual environment where you can play around all you like and nothing is ever broken. All you do is back up your virtual machine (VM) software folder first, then if anything goes sideways you can restore it in about a minute. Developers use this approach all the time.

Oracle’s VirtualBox (VB) is free virtualization software you can install just about any Operating System (OS) into (assuming you have a legal license). I’ve been running Windows XP, Ubuntu Linux and Android this way for several years – XP mainly to support some old software that won’t run in current OS’s, the others to explore and experiment with. No messing with my main computer or setting up a boot loader. The other systems run in a window, so no rebooting is required. File sharing is much like sharing over a network.

Fred Langa has written an article with step by step instructions for installing VirtualBox and Android on a PC. Most of the steps are pretty obvious but there are a few options that are not and a couple of gotchas. Note his comments about the captured mouse (for touch-screen behaviour), for example.

For Android, you need VT-x (AMD-V on an AMD processor) enabled in the PC’s BIOS. Most modern processors have it, but it may be off by default. I checked a couple of utilities to confirm I had it, but it was off anyway. Just reboot into your computer’s BIOS and turn it on (instructions vary by maker). If you skip that step, the installer will tell you Android is not supported, so do take care of that first.

Some people who also use Microsoft virtual machines (like XP Mode) may find VT-x not working because Hyper-V is hogging it. In that case, it’s on in the BIOS but still unavailable in VB. In Windows 7+, the Hyper-V setting is hidden. Comment 5 on this thread offers the command line for turning it off and on. If you want to get fancier, I noticed this article on Hyper-V Manager. It still requires a reboot though.

Other Choices
If you just want to play an Android game on your PC, you might like Bluestacks. It’s designed for loading apps on a PC. It says it’s free only while in beta though.

Genymotion is an Android virtualization tool that creates VMs with various OS versions and screen sizes for testing an app. That’s for more advanced testing.

More OS’s
The advantage of using a tool like VirtualBox is you can also play around with other OS’s. You can get other images (VMs) here, for example. A popular Linux distro, Ubuntu is here. Install the current VB Extension Pack to support it.

The 13.04 version of Ubuntu is an OVA file. OVA files are preconfigured – just double-click to load into VirtualBox. Far fewer steps than in Fred’s article above. It also comes with LibreOffice and other software pre-installed. Note the password on the download page for your first Ubuntu login. You can go into System Settings (gear top right) and add a new User of your choice once logged in.

Rather than downloading a virtual machine, you can also install an OS directly yourself. Create the container in VB (New button), then install into that. This article reviews installing a distro from Ubuntu directly.

You can install a wide range of other OS’s, including Windows and Mac, in a virtual machine – it’s a great way to test and experiment without messing anything up. Or to run old software that won’t install in a modern OS.

Given the end of Windows XP’s support in April, it will soon no longer be safe for web surfing and other Internet uses but it may still have a role for old software in a virtual machine. Fred reviews installing XP into a virtual machine and the VM backup process here. (article free for subscribers) If you have an old XP install you’re retiring and want to move it to a virtual machine, you can use the free Disk2vhd. This is especially useful if your old computer didn’t come with system install disks. VHD is a Microsoft virtualization format but VirtualBox can use it.

And if you have some concern this is experimental technology or something, it’s been around for years. If you surf the web, you’ll have used a virtual machine. Many large web sites are run in virtual machines so they can, in moments, shift from one physical server to another when under load.

Backup That Counts

January 25, 2014 at 12:26 am | Posted in Backup, Computers, Internet, Online services, Security, Software

Many tell me they’re not worried if their computer dies – they’ll just buy another. But gradually as our lives go more digital, we start collecting digital things that are more difficult to lose. The password for the service you paid for. The holiday photos. Your oh-so-carefully prepared resume. Important contact information. The list gets larger and larger.

With that growing body of digital history, the need for decent backup grows. For most people, you want it to be automatic. Set it and forget it. Manual gets forgotten.

At the same time, you want a backup that works. If it’s not reliable or there are barriers to getting access to your key files during a failure, it’s not working. A backup is only useful if it can be easily restored. I’ve seen studies that show even expensive business backup solutions failed in practice the majority of the time.

Software and Data, Local and Remote
There are 2 types of stuff to back up and 2 types of places to put that backup.

The first type of stuff to backup is your operating system (OS) and programs. The key reason to do this is to get you up and running again as quickly as possible. Having to reinstall the OS, all your software, and all the updates can literally take days of your life.

The best solution for software is an Imaging tool. The ones built into modern operating systems (like Windows 7+) are fine. Or buying the well-known Acronis TrueImage. This can be set up to be automatic. Weekly is probably enough unless you experiment with software a lot.

The second type of backup is for all your stuff – all the files you create or receive and store in the digital world. If your needs are simple, the above imaging software may be fine. Just image it all together. If you do this, set the backup for ‘daily incremental’. This will catch all the changes made each day.
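The selection step of an incremental backup can be sketched with a modification-time scan – only files changed since the previous run need copying. This illustrates the general idea, not any particular product’s implementation:

```python
import os
from pathlib import Path

def changed_since(root: Path, last_backup: float) -> list[Path]:
    """Return files under root modified after the last backup timestamp.

    This is the selection step of an incremental backup: only files whose
    mtime is newer than the previous run need to be copied.
    """
    changed = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = Path(dirpath) / name
            if path.stat().st_mtime > last_backup:
                changed.append(path)
    return changed
```

A real tool also tracks deletions and handles the monthly full; this just shows why a daily incremental stays small.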

The downside of imaging your data is access to that backup. If your system goes down or is stolen, you have no quick access to your stuff inside the backup until you have a similar environment and software installed. Go to your old Vista computer, for example, and you’ll have to jump through hoops to get at your Win7 backup.

A better solution is simple file copy or zipping. Those copies of all your created files can then be accessed at any time by any OS – even from a floppy. Cobian Backup (Gravity) has been my recent free choice for that. Plug your backup drive into another computer and get to work.

For your most critical files, where you want to save current versions more often than daily, I recommend File Hamster. When a file or folder is added to File Hamster, every time you hit save it makes an additional copy to the location of your choice (a different drive). This has saved my bacon a couple of times when a file got corrupted. And this is much more likely to happen to files you use all the time.

The program is not presented as free, but if you don’t purchase it after the trial period, it reverts to Basic mode. It’s more than worth paying for though. I wrote 2 articles about it here.

Location, location, location
The first type of backup location should be local, due again to the simple question of immediate access. In your office or nearby on the network. An external hard drive or network attached storage (NAS) are best and not expensive. Different types of backup above can be saved to different folders on the same external drive. Figure on double to triple what you have now for the size of the external drive.

Backing up to an optical disc is useful for long term archives, but is too manual for automated backup. Thumb drives have longevity issues and are again too manual.

Unfortunately, a local backup will not save your files in the event of a fire, major theft or other such disaster. For that you need a secondary off-site backup. But it should be secondary. Automated remote backup still has too many possible points of failure to be your primary solution.

Storing an OS image in the cloud is problematic as it is large and thus takes massive time and bandwidth to upload. Not to mention the cost of the on-line storage. And then if you have a failure, you cannot restore the OS from the cloud. You need an OS to get to the OS.

The simple off-site solution for the OS is making a periodic image to portable media and storing that safely off-site. Then you can still restore if your local backup solution fails. It’s a bit manual, but ensures it’s easy & workable.

The focus of your automated secondary backup is for your critical files. On-line file-sharing and backup services have been growing in leaps and bounds. I even researched setting up one myself. But they also have issues. Make sure the service is suited to the task. Some sites delete your files automatically after a certain period of time. They’re not designed for backups.

Copying your critical data over the wide-open Internet is akin to sharing – not a good idea. Some suppliers may add encryption, but make sure it’s also encrypted in transit or you’re exposing your content where it’s most vulnerable. Complicating your choices are the cost and the fact that some use quite proprietary techniques. This can again create access issues in the event of system failure.

In a recent article, Fred Langa introduces an alternative solution: use the on-line storage of your choice, plus a local pre-encryption tool that automatically encrypts files and uploads them to that on-line service whenever you copy files to it. He used Boxcryptor. (requires .NET 4)

You set up Boxcryptor and point it to your on-line storage. Then you set up a secondary backup routine in your backup software to copy to your designated Boxcryptor folder on an automated schedule. Backup to Boxcryptor to On-line storage. Voila – automated and secure on-line backup. The basics are free for personal use.
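The backup routine in the middle of that chain is just a “copy what’s new or changed” job. If your backup software can’t target an arbitrary folder, a minimal sketch in Python shows the idea (the folder paths are hypothetical – the virtual-drive letter is whatever Boxcryptor assigns on your system):

```python
import shutil
from pathlib import Path

def backup_changed(src: str, dest: str) -> list[str]:
    """Copy files from src into dest if they are new or have a newer mtime."""
    copied = []
    src_root, dest_root = Path(src), Path(dest)
    for f in src_root.rglob("*"):
        if not f.is_file():
            continue
        target = dest_root / f.relative_to(src_root)
        if (not target.exists()
                or f.stat().st_mtime > target.stat().st_mtime):
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied.append(str(target))
    return copied

# Hypothetical paths: a project folder and the Boxcryptor virtual drive
# backup_changed(r"C:\Projects", r"X:\Backup")
```

Because `copy2` preserves timestamps, running it again copies nothing until a file actually changes – the same incremental behaviour a scheduled backup job gives you.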

From a convenience and recovery standpoint, those encrypted files are then available from all platforms anywhere – Android, Mac, & PC.

Be sure the size of your backup routine is less than the size of the on-line space you have. BoxCryptor does allow you to connect multiple services. Also make sure you’re backing up to the virtual drive, not the BoxCryptor.bc folder, or the files won’t be encrypted. Same with decrypting them – get them from the virtual folder or they won’t be decrypted – they’ll just be gibberish. PCWorld talks about using BoxCryptor here.
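Checking the size of your backup set against your on-line quota is simple enough to script. A minimal sketch (the folder path in the comment is a placeholder):

```python
import os

def folder_size_mb(path: str) -> float:
    """Total size of all files under path, in megabytes."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total / (1024 * 1024)

# Make sure this comes in under your on-line storage quota:
# print(folder_size_mb(r"C:\Projects"))
```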

Fred’s article talks about using it with SkyDrive. Boxcryptor supports a wide range of on-line providers, including Dropbox, Google Drive, Box and many more.

Just make sure you use LastPass or some other tool to securely store that unrecoverable Boxcryptor password, in a place that can be accessed any time. Otherwise, you’ve added a key point of failure.

I also reviewed several other encryption options. Some of the on-line storage companies are offering software to do much the same with their own tools but reviews said they were slow. Several tools only work with the big 3 or even only with Dropbox.

And then there’s ownCloud. A little more geeky, but it lets you create your own on-line storage in whatever web server space you have available (assuming your web host is OK with that). It can also manage other sites, mount WebDAV-supporting services like Box, Dropbox, and Google Docs (which it will also open), and it supports FTP. It also gives you a cross-platform tool for accessing files, calendar, contacts, bookmarks, galleries and so forth.

Blend that with Boxcryptor and you have your own custom solution.
Happy computing,

Driver Updates

November 7, 2012 at 1:34 pm | Posted in Backup, Computers, Hardware, Online services, Security, Software | 1 Comment

Keeping your computer up-to-date helps keep it reliable, more secure, and bug-free. Windows Update will keep Windows and some related tools like IE current. Even if you don’t use IE, you should keep it current due to its close ties with Windows. Tools like Secunia PSI help review your other software, including links to the makers’ updates. You especially need web tools like Java, Flash and Acrobat to be current.

But what about drivers? Drivers are pieces of software that communicate between computer devices and software. They translate what the software needs into hardware commands and vice versa. A printer driver, for example, allows a Word document to be converted into a printed page. Every component inside your computer and every item you attach to it needs drivers to work. A typical computer has many, many drivers.

There is some debate about updating drivers. Some feel that if it ain’t broke, don’t fix it: if your computer is working fine, don’t mess it up. Others note that driver updates can include bug fixes, security patches, and added features. An ideal tool would let us easily update drivers but roll them back if there’s a problem.

Worse, updating drivers can be a fiddly job.

First you have to exactly identify the device you need drivers for. This means the make and model of the original component. Free tools like SIW can help a great deal. I can remember having to open up the PC and look around with a flashlight for the part’s model number.

Next, you have to go digging in the manufacturer’s web site to look for updates. (Support, Downloads section typically) By manufacturer, I mean the original maker of the component, not the assembler of the PC, like Dell or Sony. While the assembler often has component updates, they don’t usually keep them updated for more than a few years.

And you have to do this for each device, not knowing whether it needs updating or not. Complicating the issue are web sites and software that purport to help but are actually scams to feed you ads or infect your computer. You can see why many don’t bother.

Recently, a friend’s computer was randomly blue-screening with little consistency. After checking the hard drive (chkdsk), system files (sfc) and memory (diagnostic), I looked at driver update utilities.

Several forums recommended the free SlimDrivers. It not only found out-of-date drivers but also offered links to the manufacturers’ updates (not its own or second-hand sites). Click the link and it will ask you to set a restore point (recommended) and will back up the old driver. Then it downloads and starts the install, which you take over to completion.

I noticed a few people complained it wanted you to reboot the computer after. This is normal. For Windows to enable a new driver, you have to restart the computer. It’s also a good idea to reboot as you go along rather than in one big go after a lot of updates. Then, if there’s any issue, it’s easy to roll back the problem. While update problems are not common, they’re much easier to fix this way.

A few notes on installing SlimDrivers:
- When installing, it preselects installing the AVG toolbar. Just deselect that. (all too common)
- When you finish installing, deselect Run Now. It won’t work well without Admin privileges. Run it afterward from the Start menu so you can approve the Admin prompt.
- It sets itself to start with the computer. You really don’t need driver updates except occasionally. After the program starts, select Options (the toolbar gear) and deselect ‘Run at Windows Startup’. Then click Save.

As this Major Geeks intro indicates, start at the top and work down. Some drivers will update in groups. And note that the Uninstall menu is for uninstalling driver updates, not the program.

I would recommend getting Windows updates from Windows rather than using other tools.
Happy computing!

UPDATE: I ran into a little gotcha on one system that’s good to be aware of. The system started bluescreening some weeks after driver updates. Analysis pointed to a driver issue, and the IRQ pointed to the video chip. The video driver was current. However, an ATI driver for an ATI chip may not actually be the right one. If it’s an onboard multimedia chip, most common in laptops, the implementation may be specific to the system maker, and a generic chip manufacturer’s driver is unsuitable. In that case, make sure your video driver comes from the system maker rather than the chip maker. When I installed the seemingly older Sony video driver for that system, the problem was resolved.

Tools for Going Digital

November 2, 2012 at 5:03 pm | Posted in Backup, Computers, Hardware, Media, Movies, Music, Software | 1 Comment

Gradually over the last few years, I’ve been digitizing my stuff. It takes less space, it’s easier to find, it can travel with me, and I can work and play with it immediately. I can also re-purpose it, like turning photos into a screen-saver, music into a play-list, and school notes into a reference library. Files are all one style, not stored by media (records, tapes, etc.) or misfiled or needing yet another device to play or piece of furniture to store. The computer becomes a repository of my life.

After getting a digital camera, photos were the first to migrate. Mostly I used an HP scanner with a photo feeder. If you have the option, a quick programmed action in Photoshop to colour balance, despeckle (remove dust), and sharpen leaves a polished job reasonably quickly. All is now sorted in folders by year, month and event.

I found a film projector at a garage sale for the old super8 films and videotaped that. A little klutzy but the films were in rough shape and many edits had broken. Home videos I then turned into DVDs using a DVR. The Panasonic model had a built in hard drive so they could be digitized then sorted onto DVD’s. That model DVR died prematurely but fortunately after conversions were done.

Another project was all the various music media, including old albums and cassettes. The free Exact Audio Copy is best for CDs (add LAME for MP3 output). Audacity has the pop & noise removal and editing tools for older media. If you own music but no longer have a device to play it for digitizing, use the power of Google to find replacement files. Google allows you to search for file types using specific search commands. Tools like Gooload and GoogleMusicSearch make this easier. Read what the second has to say about the technique and spam sites. You’re looking for plain directories of stored files, not graphical promotion and spam sites. And you’ll also want to check the conversion quality (bit-rate) of the source. (R-click file, Properties, Details, Audio Bit-rate)

I also tackled the family photo albums, some hand-tinted going back 100 years. As the photos and memorabilia were glued to the pages, this required a large format scanner.

There are now 2 types of scanner lights. Traditional bulb scanners like Epson’s are great for art and professional uses where some depth of field (focus range) and colour-precise imaging are required. Newer LED scanners don’t have the depth of field but are fine for flat material and much less costly. You’ll see reviews of LED scanners are polarized, as some buyers are caught unaware by this difference.

I got a Scanexpress A3 1200 for this project. Now, many family members have copies of the old family albums and we don’t have to worry about where to eventually store the big pile of crumbling, fading albums.

Slides, I thinned out. The slide attachment for my HP scanner illustrated how dust on slides is massively magnified and way too much labour to fix. The old travel slides were mostly scenery and had bleached out. Some of the best had been printed anyway. I picked out the very best of the rest and had a photo lab handle the conversion. The cost can add up, but a pro shop has the gear to clear dust and do it well.

The last big project was all the paper. Binders of school & course notes, workshops, family records, business cards, writing, recipes, references, correspondence, and on and on. Very little of it needed to be kept in paper form, filling boxes and file cabinets. And some of it would be much more useful if it were searchable.

Enter yet another scanner. This one, the Fujitsu ScanSnap S1300i. While the above could do the scanning, it would take far too long. My old company uses Fujitsu scanners to scan thousands of pages of documents every day. They’re a real workhorse but not inexpensive. I was pleased to see this one at a reasonable cost but remarkably full-featured and smaller than a loaf of bread. Just flip open the lid and it turns on. Stick in the paper – from business card size to 8.5″ x 14″ – and push start.

In one pass it will:
- scan both sides of the page if they have content
- straighten the image if it’s sideways or a bit crooked
- determine if it’s B&W, gray-scale or colour content
- determine the page size
- combine each batch into a PDF
- and more.
The unit will also scan to email or on-line storage which can be synced to smartphones. The list goes on and on.  (see the above link for more)

That PDF can then be made searchable with the built-in Abbyy Finereader OCR (Optical Character (text) Recognition) tool. You can set it to do this automatically but I’d recommend this be a second step after scanning. You can start OCR and scan more at the same time. Also you can skip messy handwritten documents, images and other files unsuitable for OCR.

In addition to being searchable (quick find), an OCRed scan can be used to cut and paste quotes, though the quality of OCR text will depend on the quality of the original. Expect a few typos.

Documents that have been bound or hole-punched together may be prone to stick together. In this case, load a page, then the next, and the next a little ahead of the feeder. This ensures they don’t bind and the scan is complete. I’ve done hundreds of pages in a single file this way.

PDF editing tools may be useful afterward if you need to combine or separate PDFs or insert pages. You can find free recommendations through here. Note: OCR first. Some tools change the meta source info of the PDF, and the ScanSnap Abbyy will not OCR files identified as coming from other sources. (that’s a more expensive product)

Now that I’m caught up, it’s easy to convert new documents. Just flip open the top, stick in the pages and press start. By default, the file-name is the scan date & time so I rename and file it after. No boxes of archives and a mostly empty filing cabinet. I can also use it to scan photos so it’s become my day-to-day digitizer.

And of course, order your statements and such digitally rather than in paper. Then you have less wasted paper to scan. I get little postal mail now.

You do want to learn to organize files on a computer. A heap of dated scans really doesn’t serve you well long term, though you can use Search to find content. Give the folders and important documents names with dates for faster finding. File them in a sensible folder structure. See more in Digital Filing Cabinet.
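Since the scanner names files by date, the year/month filing above can even be automated. A minimal sketch, assuming filenames that begin with YYYYMMDD – adjust the pattern to your scanner’s actual naming:

```python
import re
import shutil
from pathlib import Path

# Matches names like 20121102_170301.pdf (assumed naming convention)
DATED = re.compile(r"^(\d{4})(\d{2})\d{2}")

def file_by_date(inbox: str, archive: str) -> int:
    """Move date-named scans into archive/YYYY/MM folders; return count moved."""
    moved = 0
    for f in Path(inbox).iterdir():
        m = DATED.match(f.name)
        if f.is_file() and m:
            year, month = m.group(1), m.group(2)
            dest = Path(archive) / year / month
            dest.mkdir(parents=True, exist_ok=True)
            shutil.move(str(f), str(dest / f.name))
            moved += 1
    return moved
```

Files that don’t match the date pattern are left alone for manual filing, which fits the rename-and-file-later habit described above.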

Of course, if you go digital, make sure you have an automated backup system. For myself, I copy off the assembled scans to DVD for archive and have an automatic daily backup to an external hard drive. If it’s important to you, store those DVD archives in another building.

For Windows 7, the included imaging tool works well. For an extra data backup tool, I use the free Cobian Backup. Both go to an external hard drive, and both are scheduled.

Hope the suggestions are useful.

Losing FAT for NTFS

October 2, 2009 at 8:23 pm | Posted in Backup, Computers, Software | Leave a comment

Most people with Windows XP or above nowadays have an NTFS file system. The data on their hard drives is stored with this structure. (Macs, Linux, etc. use varieties of UFS, descended from Unix) However, some people who have had XP for awhile began with FAT32 for compatibility with older computers, utilities, and so forth. NTFS is recommended for hard drives over 400 MB, which is most of them now. But not for Flash drives. And some do recommend FAT32 for other solid state drives.

more file system info

It’s now old news to bring up a FAT32-to-NTFS migration, but I recently ran into a technique that can make it much more effective. But first, let’s review why.

NTFS has a number of advantages:
- reliability – the file system is more robust and includes hotfixing and recovery.
- It handles files much larger than FAT32’s 4GB limit, like those new HD movies.
- Like Linux, naming is case sensitive, so FILE.txt and file.txt are different files. It also time stamps last accessed time. (POSIX)
- cluster size – it uses 4K clusters rather than 16 or 32K, so small files waste far less disk space (every file takes at least one full cluster).

The last one is the key issue in a conversion. To get the most out of NTFS, you want those small clusters. But if you’re converting from FAT, it may not be so easy. Ideally, you can move the data to another drive, reformat the partition to NTFS, then move the data back again.
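The cluster-size waste is easy to quantify: every file occupies at least one whole cluster, so allocated space is file size rounded up to a cluster multiple. A quick illustration:

```python
import math

def allocated_kb(file_sizes_kb, cluster_kb):
    """Disk space actually consumed: each file rounds up to whole clusters."""
    return sum(max(math.ceil(s / cluster_kb), 1) * cluster_kb
               for s in file_sizes_kb)

small_files = [1] * 1000                # a thousand 1 KB files
print(allocated_kb(small_files, 4))     # 4K clusters (NTFS): 4000 KB
print(allocated_kb(small_files, 32))    # 32K clusters (large FAT32 volume): 32000 KB
```

The same thousand small files consume eight times the space on 32K clusters – which is exactly why the conversion is only worthwhile if you end up with small clusters.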

Data Drives:
The procedure I use for Data drives:
– clean out the junk
– defrag* if it’s been awhile. Moving the files to NTFS does some defragging but this will make the process faster if there’s a lot of fragmented data.
– copy the files off the partition (backup)
– reformat the partition/drive to NTFS 4K
– move the files back
– recheck in Defrag to polish
– turn off indexing – (right click the drive, Properties, uncheck Indexing). More on indexing below.

Note that this can be a lengthy process – some of the steps take awhile, depending on how much stuff you have and how tidy it is.
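Before reformatting, it’s worth confirming the copied files actually match the originals – a backup you can’t trust is no backup. A minimal verification sketch using checksums:

```python
import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_copy(original: str, backup: str) -> list[str]:
    """Return relative paths that are missing or differ in the backup."""
    bad = []
    orig_root, bak_root = Path(original), Path(backup)
    for f in orig_root.rglob("*"):
        if not f.is_file():
            continue
        rel = f.relative_to(orig_root)
        twin = bak_root / rel
        if not twin.is_file() or checksum(f) != checksum(twin):
            bad.append(str(rel))
    return bad  # an empty list means the copy is safe to trust
```

Only when the list comes back empty would you go ahead and reformat the partition.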

*For a free defrag tool, I’d suggest MyDefrag (formerly JkDefrag). It uses the Windows API but is faster and much more thorough. If you really want to organize your data, try its Monthly Optimize – but plan to run it overnight.

If you’ve been meaning to adjust partition sizes, format time is a good time to do it. XP’s Disk Management is primitive compared to a good partition tool, which lets you change sizes and move partitions without data loss. Some free ones are decent, but a good tool lets you boot from a CD and work outside the OS. For that I’d suggest Acronis PartitionExpert, now called Disk Director, or BootIt NG. The second is cheaper but geekier. (The long-recommended PartitionMagic is not what it once was.)

If you don’t already do it, it can make your life a lot easier to separate your data from your programs. Programs need occasional Imaging. Data needs regular backup. Much faster if you don’t have to go fishing. You can move My Documents, email and more to the Data drive for easy backup.

Also, it’s worth mentioning that older computers and Macs can’t fully share an NTFS system. You may want to keep one small partition on FAT32 for sharing.

If you make enough of a change to a hard drive, you may need to reboot Windows after it’s “found new hardware” post change.

Boot Drive:
Your Boot or system drive has a different issue. You can’t just copy all the files – some don’t copy. My usual recommendation to use an Imaging tool that mirrors the drive is not suitable in this case as we’re changing file systems. Many Backup programs leave out “in use” files, making a recovery useless. While you can use a CD based backup that can lock C and get it all from outside of Windows, there’s an easier way.

Here, Alex Nichol suggests a feature of BootIT NG to realign the drive clusters so they’ll convert efficiently. (you can also make a bootable CD) Then you can use Microsoft’s built in Convert tool.

In my own case, I discovered that the format tools I’ve been using had already been building small clusters. Your mileage may vary – I’d suggest you take the step anyway.

For the Boot system:
– do a standard Image backup. (if you need to recover from this, it will be back to FAT32)
– align for NTFS per (above)
– defrag
– use Convert
– turn off Indexing (drive properties)
– check Defrag to polish
– Do a new Image backup from NTFS

You could do something similar for the data drives, but I found the above faster. However if you have a very full data drive with nowhere to move it, use the boot system process. It worked nicely for my full Backup system.

Pagefile – it’s also worth mentioning the pagefile on a boot drive. It’s quite large and can be very fragmented. (mine was in over 800 parts) If you move the pagefile to another drive during the defrag, you can put it back in open space. (My Computer, Properties, Advanced tab, Performance – Settings, Advanced tab, Virtual Memory – Change)  You can also try PageDefrag, a free tool designed for the job.

Note – old ideas about putting the pagefile on another drive or splitting it are no longer valid. It should be on the first partition of the fastest drive (usually the boot drive) and you can let Windows manage the size.

The Windows Indexing Service tends to bog computers and fight with anti-virus, etc. When you do an NTFS conversion, Indexing is turned on. Turning off indexing on each drive is just the first step.  (Thanks Fred Langa)
– Start, Run, ciadv.msc
– right click on the service and stop it if necessary.
– right click and delete the catalogue(s)
– close
– Start, Run, Services.msc
– Right click on Indexing Services, Properties, set it to Disabled.
– close
If you prefer to keep the Indexing service on, be sure to only be indexing drives you expect to search – the Data ones. You can search on line for ways to use it better and overcome problems.

Happy Computing.

The Ghosts of Computing Past – Part 2

October 1, 2009 at 2:38 pm | Posted in Backup, Computers, Hardware | 2 Comments

Sometimes the weirder problems are not software at all but rather device problems or conflicts. While you don’t have to worry about IRQ settings and such like the old days, the device drivers sometimes have to be updated to work with the evolving computer. Drivers are basically bits of software that communicate between the operating system and the hardware.

To see if you have hardware issues, right click on My Computer and select Manage. In there, you’ll find a bunch of tools. First we’ll look at Device Manager.

Devices with problems will show with a yellow or red mark. Or if the computer doesn’t have drivers for the device, it will show as a yellow question mark.

If you know the device is long gone, right click and select uninstall.
To fix the item, double click to open it. You can see the possible issue, driver version and other details there. If it’s telling you there’s a driver issue, click the Driver tab and click Update Driver.

You may first want to do a web search for the drivers online. Best to stick with the maker’s web site first. Typically, they’re under Support, Downloads, model number. You can compare driver versions with what’s installed, although developers have a bad habit of using several numbering systems. You may also find that the maker’s drivers are a big improvement over the generic Microsoft drivers that may be in use.

Another place to check for issues is the Event Viewer, found above Device Manager in the Management console. Particularly, you want to look at System. Red flags indicate a problem, yellow a warning.

The error 4226 I mention in TCP Connection Limits is one example of a yellow. A security measure was reducing some Internet performance.

If you’ve changed hardware over the years, you’re bound to have some junk collecting. I used to have a video card with a TV tuner, for example. Even though the hardware is long gone and the software uninstalled, it left vestiges that caused occasional video playback issues. It acted like I was missing a codec (a compression-decompression routine used to shrink audio and video file sizes like MP3) but codec tools didn’t help.

Turned out to be the vapours of hardware past. The old services were being called but “failed to run” as the device was no longer present – and for that matter, neither was the software. As it was time to upgrade the video drivers, it was also time to clean house and tackle the issue.

As with other areas of Windows, Device Manager does not natively show you everything. So if you’re trying to fix something you can’t see…

This site tells you how to turn on the visibility of hidden devices. A simple command line setting change, then you can show the hidden items in Device Manager from the View menu. (each time you run it)

What you’ll see is stuff you shouldn’t mess with, like chipset devices with numbers, plus hardware you plugged in at some distant time. The ghosted entries are not currently “installed”. You’ll recognize some things you use occasionally, like a USB device, but also lots of real ghosts. If you no longer have the hardware, you can right-click and uninstall it. I found quite a few ghosts this way, especially for the old video card – each software update seems to have added another version. I also found 5 hidden devices with problems, none of which were installed. 4 of them were now causing System errors – with no hardware or software on board.

After you finish uninstalling the old devices, reboot the system and recheck that the System errors have ended.

If not, it means your Registry is still calling them. Back up your Registry, either from the File menu of RegEdit (Export) or by creating a System Restore Point. (Accessories, System Tools, System Restore)

Then you can use a registry tool to try and find the entries that are still calling the long gone hardware. Most registry cleaners search for all issues but don’t seek missing hardware. The free RegSeeker has a Find Tool you can use to search by product or maker name. It also has a backup tool and a checkbox to backup changes before deletion. Or you can just use Find in RegEdit. (from the Run command)

Just don’t get too aggressive or you may have to reinstall some stuff. Some software is simply not coded to standards, thus appearing to be an error. But if the entries refer to old or departed stuff, you can clean.

This combination of clearing hidden devices and old registry calls did the trick. No more system errors.

For software cleanup, see Part 1

The Ghosts of Computing Past – Part 1

October 1, 2009 at 2:22 pm | Posted in Backup, Computers, Security, Software | 2 Comments

If you’ve been using your computer for any period of time, you’ll find it tends to collect stuff, just like your home. And just like your home, you have to take out the trash and recycling, do some cleanup, and every so often, do a larger purge.

The challenge with computers is that unless you store everything on your desktop, the load is hidden away. Over time, it will begin to slow your computer down and create problems.


In the latest Windows Secrets newsletter, Scott Spanbauer reviews the steps for preparing a computer for Windows 7. Largely, this is preflighting and some more serious maintenance. I add a few other bits for a good seasonal cleaning. Work through the list step by step.

1- Update – get your software current. This gets you bug and security fixes. Secunia makes a couple of great free tools – you can use the basic Online or download the more advanced PSI. They’re reviewed here.

Don’t be surprised if you have some programs that are creating security issues for you, including ones you’d forgotten about.

2- Uninstall unused programs. You may have lots of drive space, but many add junk to the startup, filling memory and slowing your computer. Tools like the mentioned Revo Uninstaller can help with problems or hidden things.

3- Manage what’s starting with your computer. Do you really need all that stuff running all the time? This will help unbog the system and empty some of the taskbar. Use a Startup Manager like these, or if you’re a little geekier, try Autoruns.

4- Run Disk Cleanup in Accessories, System Tools. This gets not just your trash, but a variety of other Temporary files that may have been left behind.

5- Run Scandisk. This is a little more hidden. In My Computer, right click a drive and select Properties. On the Tools tab, under Error Checking, click Check Now. Select the first checkbox to fix errors, the second for a more thorough scan. It may ask to run at the next reboot.

6- Run your AntiVirus and AntiSpyware tools to ensure nothing snuck in.

7- Defrag the drive (also System Tools). This can take a little time if you’ve not done it for awhile. Some reboot into Safe Mode and run it there, unencumbered and faster. You could also choose a free tool like MyDefrag, reviewed here.

8- Set up a backup routine. Without a backup, you could become more virtual than you planned. I have suggestions here.

If you’ve never learned to file digitally, take a look at The Digital Filing Cabinet.

Next we explore Hardware ghosts – Part 2

[NOTE – see comment on Boot log and startup below]

Optical Disc Quality – Recordable DVD’s and CD’s.

September 11, 2009 at 2:23 am | Posted in Backup, Media, Software, Technology | 3 Comments

For awhile I’ve wanted to write a decent article on buying good CD and DVD discs, but straightforward info I could refer you to was hard to find. The rare studies were very technical and often out of date. Amazon had one they pulled.

I did a detailed study of this a few years back as we needed archival quality discs and the major supplier, Kodak, had stopped making them. The best were MAM-A Gold Archive with an expected lifetime of over 300 years. In fact, they were the only ones that met the government spec. I got them through a small importer in the US.

But these are pricey for more modest uses. Most of us need something that will last more than a couple of years but we shouldn’t need over 50. Taiyo Yuden is one of the most recommended (and oldest) disc makers and has ones said to last about 70 years, but they’re usually sold rebranded by others.

And therein lies the rub. The store-bought disc brands don’t typically make their own discs. Any given brand with the same packaging can be a top quality disc or junk. Brands are thus a poor gauge. Tricks like gold foil and “made in” may or may not be good clues. Some brands have gold tops but the substrate is actually silver. The only way you can really tell is to read the Media ID of the disc. That tells you who actually made it and its type. And that requires opening the package, putting a disc in the computer, and checking it with a utility*.

First thing to do is to skip the discount brands. While cheap discs may work, they tend to have a much shorter lifespan before they start to deteriorate. I’ve also had trouble with certain brands in cheap DVD players.

Many of the recommendations you see out there are opinions based on past experience with burning certain specific discs. But is that useful if the discs become coasters in 6 months? Or if the brand varies what they put in the case? Verbatim got in trouble for this but has cleaned up its act.

Recently I ran into this review of the Top Ranked Blank DVD Media. It explains quality levels, reviews brands, and where to buy. Also what the Media ID means. The data is not entirely consistent across sections but does offer some good information.

VideoHelp’s DVD Media table illustrates further how variable DVDs are. For example, search the list for TDK and you get 70 variations; Verbatim, 57, including 5 DataLifePlus-branded -R 4.7GB discs – 4 good, 1 bad. It also illustrates how even the good discs are not supported by everything that’s supposed to support them. You may find it useful to review your Media IDs here.

Generally speaking, these brands are quite good: Taiyo Yuden, Sony, Mitsui/MAM, most Maxell
These brands vary: Verbatim, HP, Imation and especially Memorex.
The worst brands won’t even burn reliably, making them useless.

I’ve had little issue with Verbatim and Maxell from brand stores like Staples. The Verbatims I’m using now are quality MCC from Mitsubishi. The Maxells are ProDisc, almost as good. I’ve had poor experiences with Memorex and TDK, and horrible ones with a few minor brands.

Looking at the suggested suppliers in the Top Ranked article, it comes down to the quality-mindedness of the retailer, because the customer has no real way to tell in the store. No “ingredients list” – especially with DVDs.

*Tools for checking the Media ID:
In the Nero suite, DiscInfo is good.
An earlier version is available free.

DVD Identifier
They also offer a downloadable ID database

Version 6 is advertised as “free” in some places but it’s actually a 14 day trial. If you install this first, then install an earlier free version like 4, 4 will also come up as Trial. Thus look for version 4.

See the Media ID Quality Guide section or the database above for ID reference.

Search VideoHelp for more

-R vs +R DVDs
Recordable DVDs introduced 2 competing standards, DVD-R and DVD+R. Basically, they are 2 ways of storing data on the disc.

+R is a few years newer and technically superior. If you are using the discs only on a computer, such as for data backup, choose +R – more so for archival purposes.

This article goes into great detail on why he thinks Taiyo Yuden DVD+R discs are best for long term archive. (they don’t meet all gov’t archival specs though)

If you plan to use the discs for videos playable in DVD players, you want DVD-R. While newer DVD players often support both, older ones only support -R, and some of the newer ones don’t support +R consistently. It may work on your player but not on your friend’s, for example. Good branded DVD-R discs are thus best for Video DVDs. They are also often slightly cheaper.

Note that for DVD player playback, you also have to encode the video in MPEG2 or DivX (if the player supports it). Burning software that records to DVD or Blu-Ray doesn’t mean it will play in a DVD Player, just that it will burn.  Most players won’t play data discs with various video files on them. They require a specific file format and structure. Thus, you want to create a Video DVD, not just burn to DVD.
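For reference, the specific structure players require looks roughly like this – authoring software builds it for you:

```
DVD root
  VIDEO_TS/
    VIDEO_TS.IFO    disc-level menu & navigation info
    VIDEO_TS.BUP    backup copy of the IFO
    VTS_01_0.IFO    navigation for title set 1
    VTS_01_1.VOB    the MPEG-2 video/audio itself (VOBs split at about 1 GB)
    VTS_01_2.VOB    ...
  AUDIO_TS/         empty on video discs; kept for player compatibility
```

A data disc with loose .AVI or .MP4 files on it has none of this, which is why most players reject it.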

Suggested free software for burning Video DVD’s.

Nero 9 Free for burning and copying. This does not include Video DVD creation but may be useful for copying, etc. Some people love Nero, others consider it bloated. The full suite has quite a wide range of tools including Video DVD, menu creation, etc. I usually use Pinnacle for DVD building. Note that DiscInfo is available in the free version as an optional update from the Control Center.

CD Media
CD media quality is very similar to the DVD suggestions above. Here is a good review of CD-R quality and makers.

Further discussion on recordable CDs, though more subjective. The article includes links to CD Identifier utilities, though the tools above should handle both. (DVD Identifier does not mention support for CDs; the others do.)

BTW, if you use stick-on disc labels, these are known to dramatically reduce the lifespan of discs. (the glue) I’d recommend you copy any such discs you want to retain to newer media.

UPDATE: some of the latest BluRay recorders will handle burning the new M-Disc format. This is quite different from the usual marking of dye on other recordable CD’s & DVD’s. It’s more akin to a pressed disc, like a mass-produced DVD movie, and should last substantially longer. Notably, the discs are also semi-transparent.
