AWS RDS SqlServer Native Backup and Restore

Had to learn this yesterday to clone a production environment down to a lower environment. Figured it qualified for a blog post.

exec msdb.dbo.rds_backup_database 
         @source_db_name='xxxProd',
         @s3_arn_to_backup_to='arn:aws:s3:::xxx-sql-native-backup/xxxProd.bak',
         @overwrite_S3_backup_file=1,
         @type='full';

exec msdb.dbo.rds_task_status;   -- till lifecycle=SUCCESS

ALTER DATABASE xxxUAT SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
drop database xxxUAT;
exec msdb.dbo.rds_restore_database
         @restore_db_name='xxxUAT',
         @s3_arn_to_restore_from='arn:aws:s3:::xxx-sql-native-backup/xxxProd.bak';
exec msdb.dbo.rds_task_status;   -- till lifecycle=SUCCESS

delete from xxxUAT.dbo.SensitiveTableStuff;

The gotchas were:

  • Had to set up an option group that added SqlServer Native Backup and Restore to the RDS instance (see the CLI sketch after this list).  It took a few minutes to apply, but the RDS instance did not reboot or go offline during the process.
  • Could not restore over an existing database.
  • Learned the hard way that while you can detach a database, you can’t re-attach it using SSMS – reattaching uses a custom stored procedure.  And detaching and attaching had nothing to do with deleting.
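For reference, the option group step can be scripted as well.  A hedged sketch with the AWS CLI – the group, instance, and role names here are made up, but SQLSERVER_BACKUP_RESTORE and its IAM_ROLE_ARN option setting are the documented pieces:

# names/ARNs are examples; adjust engine name/version to your instance
aws rds create-option-group --option-group-name xxx-native-backup \
    --engine-name sqlserver-se --major-engine-version 13.00 \
    --option-group-description "SQL Server native backup/restore"

aws rds add-option-to-option-group --option-group-name xxx-native-backup --apply-immediately \
    --options "OptionName=SQLSERVER_BACKUP_RESTORE,OptionSettings=[{Name=IAM_ROLE_ARN,Value=arn:aws:iam::123456789012:role/xxx-sql-backup}]"

aws rds modify-db-instance --db-instance-identifier xxx-prod \
    --option-group-name xxx-native-backup --apply-immediately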

Upgrading my ReFS

I got a larger capacity drive to upgrade my ReFS (Resilient File System, Windows 10 “RAID” array) with.  Also, most of my drives are >5 years old, and I’m running low on capacity, so I’d better start upgrading before the world starts upgrading me.

Here’s what it looked like before the upgrade (“Manage Storage Spaces” are the magic words to Start->Search for):

[screenshot: Storage Spaces before the upgrade]

Now I’m going to save this draft post, open up the computer, and unhook the 931G drive (if I can find it) while the machine is running to simulate a fault.

DANG! I have two 1TB disks.  Oh wait, one is a WDC and the other is a Seagate.   Cool.   The WDC is the one to unplug.

The WDC is unplugged!  It hasn’t figured it out yet.. Opening up the drive … drive opened..  WinDirStat to exercise the array .. 

there we go.  “Reduced Resiliency”

[screenshot: the array reporting “Reduced Resiliency”]

Okay, cool.  I’ll power down the machine and swap out the drive.

The Storage Spaces UI is unchanged – the (now removed) drive shows up with the yellow icon.

I have a new drive in regular Disk Management:

[screenshot: the new drive in Disk Management]

First, I clicked Change Settings.  I tried to find a way to remove the old drive, but I couldn’t find one.

So I went with adding in the new drive first.  I clicked “Add Drives” —

[screenshot: the Add Drives dialog]

Once it was happier with the drives, the option to remove the errant drive showed up. 

[screenshot: the Remove option]

I clicked “Remove”, and it asked me to confirm the drive that I was removing ..

[screenshot: confirming which drive to remove]

This took a LONG time – from 7:20pm to 9:38pm, so a bit over two hours.  Opening up a second Storage Spaces UI, I could see the drive listed as “Preparing for Removal”, which I think meant “I’m going to find all the stuff that was supposed to be on this drive and make sure it is elsewhere”.  (Confirmed – the % used started dropping slowly.)

The other thing I didn’t realize: I had to increase the size of the storage space (the virtual disk) to make use of the new free space in the storage pool.
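That resize can be done from PowerShell as well – a minimal sketch, assuming the space kept the default “Storage space” friendly name (adjust to match yours):

# grow the storage space (the virtual disk) into the pool’s new capacity
Resize-VirtualDisk -FriendlyName "Storage space" -Size 3TB

# then grow the volume sitting on it to its new maximum
$part = Get-VirtualDisk -FriendlyName "Storage space" | Get-Disk | Get-Partition | Where-Object Type -eq 'Basic'
$part | Resize-Partition -Size ($part | Get-PartitionSupportedSize).SizeMax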

And… I’m going to post this before it’s all done.  But basically: I now have about 3TB of mirrored space.  I’m also Resilio-syncing the important bits to an off-site backup.  So any two of three things could die, and I should still be good.

Year End, Backup, Archival

[screenshot: the planning spreadsheet of information sources]

Because I have the time available, rather than just reacting, I’m trying to plan what I’m going to do with my various sources of information.  It’s on my head to keep them backed up (in case of data loss or service loss) and archived (long-term storage, which also deals with service loss).

There were too many things!  I’m going with Resilio (formerly BitTorrent Sync) as my offsite, in-my-control backup (if it dies, the files are just plain old files on a hard drive), but I still need to get my stuff from Dropbox / Google Drive etc. to Resilio.  But how do I get the stuff there?

So I added Risk and Importance columns, where low numbers = don’t care and high numbers = care, multiplied them together, and got a priority.  (E.g. risk 4 × importance 5 = priority 20, which gets dealt with before a 2 × 3 = 6.)

Starting the copies while I go do other cleanup stuff around the house…

Amazing DIY Backup Camera Lines with Turn-Indicators v2.0

I’ve had two coworkers exclaim that this was an awesome idea, so I figure that makes it blog-worthy.  I tried to get an animated gif of this going… eh, not that good, but I’ll leave it.

 

How did I do it?

Step 0.1.  Acquire tablet screen protectors.  Beautiful Wifeling, who loves hunting for stuff, found them at a dollar store.

Step 0.7. Apply tablet screen protector to backup camera screen. This involved some scissor work.    Be amazed at how the polarization is just a little different, when seen through polarized glasses.

[photo: the screen protector applied to the backup camera screen]

Step 1.  Find an empty, abandoned parking lot where the parking lines all line up.  In my case, it’s at my work building.

Step 1.1  Acquire dry erase markers and sharpie. 

Step 2.  Get the backup lines figured out.  Line up the car with the wheels touching one line evenly, and then trace onto the screen protector with the dry erase marker.  Note that due to the lens configuration it’s not a straight line – it’s slightly curved.  If you’re really good, you could probably lift a fingerprint from this image.

[photos: the traced backup lines]

Step 3.  Get the hard turn lines figured out

[photo: the hard turn lines]

  • Start out parked in a parking spot. 
  • Pull out, avoiding hitting the imaginary parked cars next to you, and turning the wheel as far as possible when possible, till you’re in the street and lined up.
  • Trace the outline of the parking spot with dry erase marker.
  • Test it out by trying to back into the spot.  You may have to adjust.
  • Once you’re sure, then go over it with a sharpie.

Step 4.  Profit

I have earned at least 27 self-credits for the good reverse parking jobs I’ve done.  Go me.  I’ll spend them on some ice cream.

Backup Strategy Revisited: v2016.05

A while ago I wrote about using Crashplan as my backup strategy.  That didn’t last long – Crashplan (free) eventually stopped syncing, and I did not have time to look into why.

Then I listened to Mac Power Users Episode MPU318: Backing Up and I got re-inspired to pick up the gauntlet again.

I recently re-acquired my desktop machine at home (it had become my wife’s in an emergency move when her machine started BSOD’ing), and I decided to try Yet.Another.Backup.Solution; this time going back to what some readers had suggested (see comment section here), using BitTorrent Sync.

So far so good

[diagram: the backup topology]

  • I keep all active (personal) code bases in their own folder, c:\code, which relies on GitHub, Visual Studio Code, Beanstalk, etc – various source code repositories – for its backups.
  • I keep a Dropbox folder somewhere on all my machines.  Dropbox is my main “My Documents” folder, for ad hoc type stuff.   But, it does not get stuff related to a project involving media – for example, a video edit from my GoPro.   That’s too large for it.
    • I chose dropbox because I used them first, and I’ve stuck with them so far.  I believe I’m at 7.5G, free.  There are other services that provide similar…
    • I do an occasional 1-way copy from dropbox to a spot in my big share.  Just in case.
  • I set up a Windows Storage Space on my big home machine, across the top three drives from my former home server – 2TB, 1.5TB, and 1TB – which combined give me a 2TB 2-way-mirrored array.  Any one drive dies, no problem.  (This can be scripted too – see the sketch after this list.)
  • I set up BitTorrent Sync Pro (paid: $40, personal, one-time) to set up a personal cloud:
    • 1.2TB of data between my home storage space and an external hard drive hooked up to my work computer, over the internet.
    • Several per-year folders (2016, 2015, 2014, etc) that are much smaller and live on every machine I am productive with.  (My convention is /yyyy/projectName for any given project, copied forward if it spans years.)
      • I’m doing a lot less video work now, so the sizes are quite manageable.  2014 was the peak, that was 283G.  That’s been pruned down and put in the big sync now.   Most I’m doing for a year now is 15G, because of a GoPro video.
    • If I have a project that exceeds the size of one of the laptop drives, I’ll probably break it down into its own little BTSYNC cloud (“2016ProjectName”) and share that amongst the machines that care.
  • I’m still relying on Dropbox for my phone photo backup, although I am doing an iCloud backup as well.
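For what it’s worth, the mirrored-space setup from the storage bullet above can be scripted instead of clicked through – a sketch, with made-up friendly names:

# pool every eligible disk, then carve a 2-way mirror out of the pool
New-StoragePool -FriendlyName "BigPool" -StorageSubSystemFriendlyName "Windows Storage*" -PhysicalDisks (Get-PhysicalDisk -CanPool $true)
New-VirtualDisk -StoragePoolFriendlyName "BigPool" -FriendlyName "BigMirror" -ResiliencySettingName Mirror -NumberOfDataCopies 2 -UseMaximumSize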

I am currently not backing up my mom’s computers, because she has 4(!) of her own, with Dropbox set up to keep track of important stuff.    Go mom!  (I need to check on that soon, verify health).

On Seeding Large BtSync Share

  • I started out with a single copy of my files.   USBHD1:\SHARE
  • Copied them all onto USBHD2:\SHARE
  • Walked USBHD2 to the office, bypassing the internet.
  • Set up a BitTorrentSync folder on USBHD1:\BTSYNC1 (initially empty)
  • Sync that to USBHD2:\BTSYNC1 (initially empty)
  • At home, copy USBHD1:\SHARE to USBHD1:\BTSYNC1 – the estimate said 5 years!!!! (USB 2.0 is not great for disk speed)
  • [screenshot: the copy-time estimate]
  • At work, copy USBHD2:\SHARE to USBHD2:\BTSYNC1
  • Within 24 hours everything was synced.
  • I compared files against the original shares and there were no problems

Now that I’m more familiar with it, I could have just told the work computer, “really, use THIS folder to receive the files, yes, I know it already has stuff in it”, and it would have worked fine. 
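For the record, the copies themselves were plain file copies – here they are sketched as robocopy commands, with example drive letters (E: = USBHD1, F: = USBHD2):

:: seed the second drive from the first
robocopy E:\SHARE F:\SHARE /E /R:2 /W:5

:: at home: fill the (initially empty) sync folder from the local share
robocopy E:\SHARE E:\BTSYNC1 /E /R:2 /W:5

:: at work: same thing on the other drive
robocopy F:\SHARE F:\BTSYNC1 /E /R:2 /W:5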

Moving the data from the  USBHD to the StorageSpace was easy: 

  1. Copy the data
  2. Stop BtSync
  3. Unplug the Ext Hd
  4. Start BtSync
  5. Tell BtSync where to find the files again.
  6. Profit.

What Did Not Work Well For Me With BtSync

Using my iPhone to browse my big share: nope.  In selective sync mode it tried to download the entire share to my phone, creating one placeholder file for every file.  Too many files!

Selective Sync in general – I did not like the idea of placeholder files.   I’d rather have placeholder directories. 

In Conclusion

So.. there you have it.  I have my mirrored local copy, at least one offsite backup, several protocols, and an offline backup in case of ransomware.

We’ll see if it’s still working fine a month or two from now.

Ubuntu, Headless Mediasmart Server, Software Raid, Samba

I am now several days into this experiment.  It’s not working quite as I had hoped, but it is working.  So here’s a roadmap/dump, with links along the way:


Headless Server Install

I had an HP Mediasmart EX-485 Home Server that went unresponsive.  It did not appear to be a drive failure; the O/S started hanging.  After rescuing everything, I had this pile of 4 hard drives sitting around – 750G, 1TB, 1.5TB, 2TB – and I was wondering what I could do with them.  Well, build a server, of course!  Hey… I have this hardware sitting around… it used to be a Mediasmart EX-485 Home Server… but it doesn’t have a display, or keyboard, or mouse.  There are places you can order some add-on hardware to fix that, but it would cost me money.

I researched a couple of options, and the winner was: hook up a hard drive (I chose the 750G) to another computer, install Ubuntu on it, additionally install sshd, and only then transfer it to the Mediasmart chassis.  Luckily, most driver support is built into the Linux kernel, so switching hardware around is not a problem.
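The sshd piece is a single package – assuming a stock Ubuntu install, run on the donor machine before the drive moves to the headless chassis:

sudo apt-get update
sudo apt-get install openssh-server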

Then I downloaded PuTTY on my Windows desktop machine, and used it to connect to the server (formerly named MGZWHS, now named diskarray).

Adding in Drives and Making a Pool

I booted the server up, and checked what drives were available (sudo lsblk), and it only showed me /dev/sda and its partitions.   As an experiment, with the server up and running, I hot-plugged in another drive, and looked again.. sure enough, there was a /dev/sdb present now.

I plugged in all the drives, then went and did some research.  This led me to LVM (Logical Volume Manager); there were a ton of examples out there, but all of them seemed to use hard drives of identical sizes.

At first, I thought I needed to repartition the drives like this guy did – so that I could pair up equal-sized partitions and then stripe across them – but once I got into the fun of it, it became much simpler:

  • create PVs for each disk
  • create 1 VG covering all the PVs
  • create 1 LV with -m 1 (mirror level 1) on the VG.  This command went crazy and did a bunch of stuff, selecting the PEs to use for the different legs and the mirror log.
  • create an ext4 filesystem on the LV

The -m 1 “peeks” into the physical volumes and ensures that any piece of data is backed up to 2 separate physical volumes – and as my physical volumes are all whole disks, the two copies land on different disks.  Reconstructed from the bullets above, the commands were shaped something like the sketch below.
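A sketch – the device names, VG/LV names, and size are stand-ins for mine:

# one PV per data disk (sda holds the O/S)
sudo pvcreate /dev/sdb /dev/sdc /dev/sdd

# one VG covering all the PVs
sudo vgcreate vg_diskarray /dev/sdb /dev/sdc /dev/sdd

# one mirrored LV on the VG – LVM picks the legs and the mirror log itself
sudo lvcreate -m 1 -L 1.5T -n lv_data vg_diskarray

# an ext4 filesystem on top
sudo mkfs.ext4 /dev/vg_diskarray/lv_data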

Surprise, though – it took about 2-3 days for the mirroring to catch up between the drives.   Screen dumps available here:   http://unix.stackexchange.com/questions/147982/lvm-is-this-mirrored-is-copy-this-slow

Creating a Samba Share

I then approximately followed these directions to create a samba share.  However, it’s not quite right yet – the permissions of newly created files don’t quite match the permissions of the existing files.  Still, on my Windows machine I can definitely see the files on my server, and it’s fast.

NB: You could see the .sync files and the lost+found of the ext4 filesystem in the share.
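For reference, the share definition in /etc/samba/smb.conf ended up shaped something like this – the path, user, and masks are examples, and the create/directory masks plus force user are exactly the knobs I suspect I still need to tune for that permissions mismatch:

[share]
   path = /mnt/diskarray/share
   browseable = yes
   read only = no
   create mask = 0664
   directory mask = 0775
   force user = diskarray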

Syncing Via BtSync  // stuck

I then followed these directions to install btsync, and then attempted to sync my 1.2TB of data from my Windows box to diskarray.  It got part of the way there (385MB), creating all the files as user “diskarray” (herein lies my samba file sharing problem) – however, it’s gotten stuck.  Windows btsync knows it needs to send 1.1G of stuff to diskarray… they both come online… and nothing gets sent.  There are ways to debug this – I know how to turn on debug in Windows, but have not yet followed the directions for Linux – and eventually, I hope to fix it.

// todo: Simulating Drive Failures

I did attempt a drive failure earlier, but that was before the copy was 100% done – so it was, shall we say, a little hard to recover from.  Later on, I plan on taking one of those drives out while it’s running and seeing how I can recover.  Maybe even upgrade the 1TB to a 2TB, if I ever take apart the external hard drive that I used to have on the TiVo.  What should happen is the LV degrades from “mirrored” to “normal”, and the unmirrored partitions become unused.  We shall see.

Figuring out a Sync strategy

At first I was going to draw this all out – I did, with Visio, but the drawing isn’t quite pretty enough.  It does reflect the solution I’m going towards.

[diagram: the sync topology]

I have the following computers, with the following uses:

  • “Big” Computer – my main workhorse desktop at home.  It does a lot of Video Editing, Audio Mixing, etc.
    • Currently has USB 3TB drive #1 mounted
  • Surface #1 – my most portable computer.  Usually just edits things in the cloud, occasionally gets an offload of an SD card while I’m on the road.
  • Laptop #1 – My “I would program on this laptop” laptop, which I mostly use for Remote Desktop into my work computer
  • Laptop #2 – this is the laptop that controls the 3D printer.  It has a lot of 3D printer tools on it.
  • Work Computer – for work stuff.   Not my computer
    • Currently has USB 3TB drive #2 mounted

I pretty much use the pattern of:

  • C:\2014\<project Name>\  is where I put all files for a project.
  • I keep the same project name between computers if I move the project around.  For example, when I was doing book design for Dad’s book, that started on Laptop #1, then moved to “Big” Computer
  • I consider “iTunes” to be a project, as well as my local instance of my Dropbox folder.  Unfortunately, these are still in 2013. 

My needs:

  • When I’m working in C:\2014, it needs to be fast and stable.
  • When I’m working in C:\2014, it eventually needs to get backed up to the big backup
  • Not all C:\2014 projects should be on all computers.  In fact, almost NONE of them should be on the Surface; it only has a smallish SSD.  Same deal with Laptop #2, which has 10G or so free after the O/S.
  • The Big Backup should be offsite-level backed up.

Limitations

  • A computer cannot btsync with itself (yet)
  • An R/O folder (like the phone backup destination) cannot be inside an R/W folder.

Options

I thought about something like this:

  • BigComputer 3TB \BtSync1  syncs to Work 3TB \BtSync1
  • I create a sub-sync of WORK:\BtSync1\users\sunny\2014 to BigComputer:c:\2014

It would work, but it would be a bit ugly.  Lots of hops going offsite and then back home to get things backed up.

Winner?

I believe I’ve decided on the following:

  • The large 3TB drives maintain their own sync pool.  
  • The local folders (C:\2014) MIGHT maintain their own sync pool, on a per-project basis.   For example: 3dModels between my big computer and the 3d printer laptop.
  • Every project should end up at Sunny’s Big Computer
  • I’ll use Robocopy on a schedule to bridge the final gap from Sunny’s big computer c:\2014 to USB:\BTSync1\users\Sunny\2014 (see the sketch after this list)
  • When I bring another machine online at home which can hold large drives, I’ll add it to the big sync pool as well to have a local (faster) backup. (Offsite can lag behind by days when we’re talking video files)
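A sketch of that scheduled bridge copy – the paths and task name are examples, and /MIR also deletes on the far side, which is fine here since BtSync archives deletions to .SyncArchive:

:: one-way mirror of the working folder into the big sync tree
robocopy C:\2014 E:\BTSync1\users\Sunny\2014 /MIR /R:2 /W:5

:: register it as a nightly 2am task
schtasks /Create /TN "Bridge2014" /SC DAILY /ST 02:00 /TR "robocopy C:\2014 E:\BTSync1\users\Sunny\2014 /MIR /R:2 /W:5"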

For things like backing up Mom’s important stuff, I’d probably create a USB:\BTSyncMom folder and have that be its own pool, not make it sub- or super- to any other sync pool.   Or, continue to use Crashplan there.

Something I Don’t Yet Have

Windows Home Server gave me the ability to restore a machine to a specific state, covering all the installed software on a machine.  I could do that using a copy of Ghost, or some of the Hiren’s BootCD tools.  I don’t yet have a plan for that.  On the other hand… reformatting a machine these days is “no big deal” for me, with all the data being out in the cloud somehow anyway.

Fun with Bittorrent Sync

BitTorrent Sync can be downloaded here:  http://www.bittorrent.com/sync

Large Folders

I now have two external hard drives, one at home, one at work, synced with each other:

[screenshots: the two synced drives]

  • I first xcopy’ed one external drive to the other.   This took about 20 arns
  • I then hooked one drive up, and pointed BitTorrent at it.  It started indexing.
  • I took the other drive to work, pointed Bittorrent at it there with the same shared folder secret, and it too started indexing.
  • The two instances of BitTorrent spent quite some time chatting with each other over the network:  “I have this file!”  “I do too!”  .. getting to the point where they agreed that they were in sync with each other.   This chatting and indexing phase took perhaps 2 days.
  • They are now in sync.  They still “chat” at each other every now and then, but the network traffic is minimal. (I think)

Robustness

I then played with the robustness of the sync.  First, I renamed a folder on one –  did it transmit it as a rename?  Why, yes it does:

[screenshot: the rename propagated as a rename]

I then turned BitTorrent Sync off on one of the sides (but not the other), and did some more renaming.   I got a mixed bag of results:

[screenshot: the mixed bag of results]

What it looks like to me is: if BtSync is listening to a folder, it remembers actions taken and can catch other clients up on those actions; however, if BtSync is turned off, it treats its new indexing data as “new stuff that happened” (i.e., it doesn’t know there was a rename) and thus deletes and adds files as necessary.  In the end, the two repositories are in sync, and the deleted files are copied over to the .SyncArchive folder.

[screenshots: the .SyncArchive folder holding the deleted files]

iPhone Backup

I turned on the “sync my photos from my iPhone” feature.  It creates a BitTorrent Sync source, with only a read-only secret for others to consume.


  • If I delete a photo on my computer, it is NOT resynced from the phone.   
  • If I delete a photo on my phone, it is NOT deleted from the computer
  • The files are not date/time stamped like they are with the Dropbox export.
  • It only synchronizes when I open the BitTorrent Sync app on the phone.  There is no option for background refresh.  (+1)

So far so good, I like it.

Nested Folders

The idea is this:

  • D:\BtSync1 is synched from A to B
  • On the same computer, can I additionally sync D:\BtSync1\Users\sunny\2014\3dmodels from A to computer C as C:\2014\3dmodels?

This way, I can keep specific projects (in this case, 3d printing stuff) synched between two computers, while having the data synced to the offsite backup as well?

Answer: Yes, as long as the parent (D:\BtSync1) and child (+Users\Sunny\2014\3dmodels) are both Read-Write secrets.  I.e., I could not place my phone backup folder in BtSync1, but I can do what I want above.

Testing it out:

  • Deleted file “LaundryBasketHandleV1.stl” on A (not shown by name in the log file below). 
    • Deleted on B, and C
  • Deleted file “LaundryBasketHandleV1_fixed.stl” on C (“MOLLY-L2013”)
    • deleted on A and B
  • Created a new folder “coldwellbankercoaster” with a bunch of files on B (“SUNNYDESKTOP”)
    • copied to A and C.

[screenshot: the sync history log]

I like it so far.  Dude, I would pay $100 (one-time, multi-install) or $15 (per computer) for this software.  And for right now, it’s free!

I am brewing a plan for total world domination.  My world, anyway. 

Crazy Backup Plan using Crashplan

-Original 7/25/2013- 

I don’t have the time left tonight to write this post proper.. so here it is really fast:

  • Hanselman rule of 3
  • 1TB to cloud from WHS Home Server v1 = expensive, no thanks.
  • BuddyBackup hit some problems with scale (possibly) (I tried them first)
  • Crashplan (free) saving the day.


Quirks / Details:

  • A Crashplan backup to a USB device, when taken over by sneakernet, can be “Imported” to seed a cross-internet backup.  The caveat is, it’s not so much an import as a “start using this new location as the place to dump stuff and forget everything you knew before” – which is perfect.
  • Crashplan will not directly back up a network share, but if:
    • You run the Crashplan service as a user who has access to those shares
    • You create a symlink (MKLINK /D) from the hard drive to that share (see the example after this list)
    • Then CrashPlan can back it up.
  • Crashplan’s network bandwidth throttle + schedule keep my relationship with work on the good side.  (And I asked for permission first.)
  • Same 3TB Ext HD used for both a WHSv1 Shares backup and a Crashplan Backup.
  • I had to block WHS from backing up the 3TB Ext HD to prevent needless usage of space
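The symlink trick from the list above, concretely – an example using my old server’s name (run from an elevated prompt):

:: a directory symlink on the local drive that points at the network share
mklink /D C:\WHSShares \\MGZWHS\Shares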

It’s been a fun ride.  It’s not 100% synced yet… but it’s more than halfway there and hasn’t died yet.  Thumbs up.

-Update 7/30/2013-

I can confirm that if:

  • you do a backup to a local folder on an external USB hard drive HD1 at \CrashPlan
  • you do an over-the-internet backup to a remote computer that is also saving to a USB hard drive HD2 at \CrashPlan
  • You can then sneakernet HD1 to the remote location, disconnect HD2, attach HD1 as the same drive letter
  • and Crashplan resumes just fine.  I.e., local storage and over-the-internet remote storage are compatible with each other.
  • Good job on the design, CrashPlan!

However, I did run into a problem at about the 900G mark.

Backup, Balance, and Cross Apply

My goodness, the weeks have flown.    I haven’t had the time to properly devote to any one geeky project.. but there’s been a bunch of smaller things going on.   I’ll try to sort them from Geeky-est to Meta.

I used Cross Apply for the first time

It’s hard to do better than Stack Overflow, so here: http://stackoverflow.com/questions/1139160/when-should-i-use-cross-apply-over-inner-join

My use case was pretty simple:

  • I had a sproc which would return an inventory of stocks held.
  • For anybody who held more than 100 shares of a stock, I was trying to create an option from those shares (100 shares = 1 option), for scale-testing purposes.
  • However, to create an option, I needed a valid combination of symbol, expiration date, and strike price (so that web service lookups against the market would return valid data)

The lookup of what is valid is something like SELECT TOP 1 EXPIRATION, STRIKEPRICE from <stuff we already know about> where SYMBOL=@symbol

The cross apply becomes something like this:

INSERT INTO dbo.OptionsToCreate (Symbol, Expiration, StrikePrice)   -- table/column names are illustrative
SELECT s.Symbol, lookup.Expiration, lookup.StrikePrice
FROM dbo.StockInventory AS s
CROSS APPLY (SELECT TOP 1 k.Expiration, k.StrikePrice
             FROM dbo.KnownGoodOptions AS k
             WHERE k.Symbol = s.Symbol) AS lookup
WHERE s.Quantity >= 100;

I had to reformat my W7 box at home

Something went haywire, and the computer would freeze every time a video played on a web page.  Even ads.  I tried all kinds of stuff around uninstalling and reinstalling drivers…  SAFE mode worked ok, but coming out of safe mode = kaboom.

So I decided to restore my machine to a previous known good version, thanks to Windows Home Server.

But I forgot that my machine had a RealTek integrated network adapter, which the WHS restore CD doesn’t know about… so no connection for restore!  I tried looking for the drivers CD, and couldn’t find it in 5 minutes.

So I reinstalled Windows 7 from scratch.  And still no network driver.  I had to go get the drivers on a different machine, and bring ’em over by sneakernet.  At this point, I was committed, so I continued the reformat.

I currently have a robocopy going from my WHS backup to my new working directory.  My incredibly awesome nomenclature is: “c:\2013”.  That’s where I put my stuff.  If a project spans years, it gets copied from one year to the other.  One folder per project.

Life Balance

  • Family Reunion Road Trip
  • 2nd swim time with Father in Law.. I’m getting better at this treading water stuff.
  • Not much running.. 5 miles here, 0.6 there, 2 more squeezed in
  • Fourth of July = dogs went crazy while we were not home = broke through drywall to get out of basement and tear through the house
  • Dryer motor getting repaired, thanks to a shoe.   And a dryer vent, thanks to a dog (it was next to the drywall above)
  • Crazy dogs led to walking the dogs in the morning.  I am not a morning person.
  • Today is door replacement day, thank you Lowes
  • Back deck is done!  Thank you Steve Ader.  He does good work.
  • 3 birthday parties … ages:  30, 33, 1.. and an upcoming 86.

Looking forward

  • I’m hoping to get my Android + OBDII set back up again.  I want to see how much gasoline I’m spending over a 2D map.
  • I could write about tracking down some performance problems in EF .. mostly around context.Attach() .. and how I replaced it with a single update statement.
  • I could write about my current project, the “feel” of it, as it went from “we don’t trust the system” to “hey, this is working pretty good”.
  • I need a different home backup solution.  If I upgrade one more hard drive, my WHS will be at reasonable capacity, and honestly I just use it as a NAS.  I have a 1-year-old off-site backup, which is not automated enough for me.  I have only used WHS’s restore the way it was meant ONCE – when my laptop got unstable and I restored it to itself, and that was a beautiful thing – but once in 6 years?  Compared to 7 or 8 reformats (usually changing OSes, or a new owner for a machine)?  … contemplation is needed.

peace out.