SPWeb.Groups only includes Groups that have permissions. SPWeb.SiteGroups includes all.

And another reminder: SPWeb.Groups only includes Groups that have been granted permissions on at least one object in the SPWeb.

If you want to create a Group that merely serves as a Container for Users, it will not show up in SPWeb.Groups, and calls to Groups.GetById() will fail. If you do not want to add some dummy permissions, use SPWeb.SiteGroups instead, which contains all groups in the Site Collection.
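
Here is a minimal sketch of the difference (the site URL, class and group names are my own, purely for illustration): a group created only as a container for users can be retrieved through SPWeb.SiteGroups, while the same lookup through SPWeb.Groups would fail.

using System;
using Microsoft.SharePoint;

public static class GroupContainerSample
{
    public static void CreateContainerGroup()
    {
        // "http://server/sites/demo" and the group name are placeholders.
        using (SPSite site = new SPSite("http://server/sites/demo"))
        using (SPWeb web = site.OpenWeb())
        {
            // Create the group without ever granting it permissions on the web.
            web.SiteGroups.Add("Container Only Group", web.CurrentUser, web.CurrentUser,
                "A group that only serves as a container for users");

            // SiteGroups sees the new group...
            SPGroup group = web.SiteGroups["Container Only Group"];
            Console.WriteLine(group.ID);

            // ...but web.Groups does not, because the group has no permissions
            // on any object in this SPWeb. The following line would throw:
            // SPGroup missing = web.Groups.GetByID(group.ID);
        }
    }
}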

Backups and Me

Yesterday I wrote a small article about Backups, in which I said that I wanted something more. I not only have a PC but also a Laptop, which I use infrequently but still like to have backed up. I also wanted to easily share data between the PC and the Laptop, and even work on the Laptop while the PC is turned off (which was not possible, since my SVN Repository was on it). Essentially, I had wanted a small internal file/web server for quite some time.

So at the end of 2008, I decided to get an HP MediaSmart EX470. This model is about to be replaced by a newer one, which means that the 470 is available as remaining stock at relatively low prices in some places. Essentially, it’s a small PC running Windows Home Server. What makes it so special (apart from the nice and small design) is the way Windows Home Server works. First of all, the machine takes four internal hard drives, with one 500 GB drive already built in and three free slots. There are also four USB ports and one eSATA port for expansion.

From all your hard drives, it creates some sort of “software RAID 0”, but without some of the disadvantages. In essence, it combines all hard drives into one big hard drive. You can mix and match any sizes; in my case, I have added three additional hard drives, all of different sizes. Unlike a RAID 0, if one hard drive dies, you can still access the data on the other three hard drives. Here is where Windows Home Server really shines. As you can see, it combined the four drives into one logical volume of 3.64 TB.

mediasmart2

Sorry, it’s in French, but the legend below the diagram says: “Shared Folders: 1.4 TB, Duplication: 11 GB, PC Backups: 261 GB, System: 21 GB, Free Space: 1.9 TB”. Let me start with Duplication. As this is a server, you access it through network shares, e.g. \\server\Music or \\server\FilthyPumpkin. When you copy files to these shares, they are stored on one of the hard drives. I do not know exactly how this is determined, but that is not so important anyway. What is important is that if one hard drive dies, you lose all data on that drive, while still being able to access the data on the other drives. However, some folders are more important than others, for example my e-Mails or Photos. What you can do is turn on duplication for a share. This means that every file stored on that share is kept not on one, but on two hard drives. So should one hard drive die, you are still able to recover the duplicated data. Of course, this also means that data on these shares takes up twice as much space, since it has to be stored twice.

mediasmart3

mediasmart4

Okay, so it’s a nice file server with smart storage. What else? It automatically makes backups of your PCs. You install a little tool on your PC which connects it to the server, and from then on it will back up all your files every day. Yes, including Windows system files and other stuff. There are two features that make it great: first, it keeps multiple backups, and it only stores changed files once, so it is very space-efficient. Second, restoration is really nice. If your hard drive dies, you put in a new one and boot from a Rescue CD. This CD connects to the server and presents you with a list of backups. Simply select any date, and your system will be restored. And I mean completely restored, including Windows etc.; you can completely reset your PC without having to install Windows first. That being said, it will not allow you to restore the backup to a different PC. If your PC completely dies and has to be replaced, you have to start from scratch, but you can still get files off that backup. You can also exclude some drives from being backed up, in case you do not want to waste space backing up that 500 GB scratch drive in your PC which only holds unimportant and temporary files anyway.

Windows Home Server is essentially a modified Windows Server 2003, but it does not have any domain compatibility: you can neither make it a Domain Controller nor join a domain with it. You can install some software on it via Remote Desktop, though, but keep in mind that the EX470 only has 512 MB of RAM, which limits it a bit. I am running SyncBackSE on it to make a copy of my Homepage every day, and that works great.

I also talked about off-site backups yesterday. Sadly, there is a little problem with WHS: since it’s essentially Windows 2003, some of the consumer-oriented backup services do not run on it. For example, Mozy Home only supports Windows 2000, XP and Vista. They do have a Mozy Pro version that supports Windows 2003, but it is more expensive. Carbonite also only lists Windows XP and Vista, with no mention of Windows 2003 (I have not tried whether it works on WHS). One service that works really nicely is JungleDisk, which works together with Amazon S3. The problem with S3 is that it is essentially a content-delivery system, which makes it quite expensive for backing up more than a few gigabytes. There are many more providers, but to tell you the truth, I have a problem trusting anyone other than those three for now. After all, backing up to a provider means that I am essentially handing my sensitive data over the internet to another company, so I want to be sure that the data is properly encrypted and secured, and unfortunately some providers do not have proper security in place. So no off-site backup for now, but I will continue to look at suggestions, also because I really want my important shares to be properly backed up.

So yeah, a good backup strategy can sometimes be a little bit tricky, but my advice still stands: getting a local backup of your data is easy and inexpensive, and I can only recommend that everyone make backups.

Backups and You

Okay, here is a story that got me thinking again about backups. JournalSpace.com, a blogging service that had been online for about six years, just had to go out of business at the end of 2008. Not because of the economic crisis, not because of decreasing ad revenues, not because of a lawsuit. They had to shut down their site because they lost all their data. There is a copy of their farewell message available, which outlines the details: they used RAID-1 as their only backup mechanism, and when the database was corrupted (be it maliciously or not), they were screwed. Game Over.

Now, it’s easy to point fingers at them and laugh or insult them, but I’m not going to do that, since it would be hypocrisy. Why? Because I made the same mistake myself, years ago. At my old company, we sold a server with two hard drives running RAID-1. We had also ordered a tape backup drive, but it was not available at the time and was due to be delivered two weeks later. But hey, with two hard drives running RAID-1, what could possibly happen? Unfortunately, the two hard drives happened to be IBM Deskstar DTLA-307045 drives. When the first drive failed one morning, we thought “Oh well, we’ll replace it in the afternoon”. When the second drive failed two hours later, the day became very unpleasant…

The worst thing in cases like these: we did not lose our own data – we lost the data of a customer. And JournalSpace did not lose their own data either – they lost the data of their users. That just got me thinking again about backups and data security, and I want to ask a question: what do you do to back up your data?

Many people do not make backups of the data on their hard drive, and even fewer make backups of the stuff they have on the web. After all, my provider makes backups, so why bother? Because it is your data, not theirs. You are the one in charge of keeping backups, not them.

Hard drives are unreliable. Years ago it was the IBM Deathstar, then the Fujitsu PB16 drives, now it’s the Seagate Barracuda 7200.11, and tomorrow it may be some other drive. But even if you manage to have hard drives that never fail, you still have the issue of a virus, an accidental rm -rf /, software bugs or anything else that can “soft-destroy” your data. I have lost important data in the past, so last month I decided that I finally needed a proper backup.

My first issue was the amount of data. I have a bit more than 1 Terabyte of data to back up, which includes 10 years of software source code, all my music, videos from my video camera and Adobe Premiere projects, my e-mail, countless photos, documents etc. So the first question was: which data is important and cannot be re-created? If I lose all my music, that’s bad, but I can just rip all the CDs again, so there is no need to back up music. For the videos, I do not need to back up the ones I copied from my DVDs, since I can just copy them again. So it essentially boils down to the stuff that is impossible to regain: e-mails, self-made photos and videos, and documents.

I decided to go with a three-stage backup strategy. First, I back up the data daily to an external hard drive. No, not RAID-1, but a scheduled synchronization (I use SyncBackSE for that, but robocopy also does an excellent job). That way, if my drive fails or is corrupted, I have yesterday’s backup. This is the first stage. External hard drives are available incredibly cheaply nowadays, and I strongly encourage everyone to get one for backup purposes.

For data that needs to be archived but does not change and is infrequently accessed, I also burn DVDs and CDs. This is stage two, and it includes photos and source videos. Just keep in mind that CDs and DVDs are not made for eternity; you should burn a new set every year. Storing them in paper sleeves or on a spindle is an absolute no-go – they belong in a jewel or Amaray case.

The third stage is off-site backups. Now, I know what you’re thinking: off-site backups are expensive and only needed for big companies. I mean, how likely is it that a fire will destroy all my stuff? My answer: maybe unlikely, but if all my stuff is ever destroyed, it would be great to be able to get back on my feet again. Also, off-site backups are really affordable now, thanks to broadband internet. There are some services that allow you to back up unlimited data. I recommend Mozy. Why? Because they actually took the time to implement security properly, which assures me that they actually care about their business.

Here is a little diagram:

backupstrategy1

Okay, so that covers the data on my PC. What about my websites? If you run your own website, you can usually just back it up via FTP. SyncBackSE does that, and so do other tools; essentially, they just download the whole site every day. If you run WordPress, look at the database backup plugin. I have configured it to send me an e-mail with a database dump of the blog every week, so I can always restore it and lose at most a week (although archive.org and the Google cache may even save my ass here). If you do not run your own website but use a hosted blog, get familiar with the backup utilities of that platform. If your blog provider does not allow you to easily back up your data, change providers. I’m serious – there are so many blog hosting providers out there that offer an easy way to make backups, there is really no reason to waste your time with a company that does not care about your data.

The above strategy served me well at first, but I wanted something more. Tomorrow I will post about my current strategy, but let me close by emphasizing again that no one but you is ultimately responsible for making backups, otherwise you may land really hard one day.

Const Strings – a very convenient way to shoot yourself in the foot

If you want a string that is defined once at the class level and can never be changed, you have two options:

public static class MyStringTestClass
{
    public static readonly string StaticReadonly = "Static ReadOnly String";
    public const string ConstString = "Const String";
}

The difference is subtle at first: StaticReadonly is like a normal field that gets initialized through a static constructor, while ConstString is “hardcoded”. If you look at the assembly in Reflector, it looks like this:

public static class MyStringTestClass
{
    // Fields
    public const string ConstString = "Const String";
    public static readonly string StaticReadonly;

    // Methods
    static MyStringTestClass()
    {
        StaticReadonly = "Static ReadOnly String";
    }
}

So the obvious difference is that a static readonly string is initialized in the constructor (which also means that you can set it dynamically at construction time), while a const string is really carved in stone. But the real difference is more subtle. Let’s create a second assembly, reference the first one and add a class:

public class MyStringTestConsumer
{
    public void TestMethod()
    {
        string sro = MyStringTestClass.StaticReadonly;
        SomeOtherFunction(sro);
        string sc = MyStringTestClass.ConstString;
        SomeOtherFunction(sc);
    }

    public void SomeOtherFunction(string input)
    {
        // Dummy function to prevent "string sc"
        // being optimized away by the compiler
    }
}

Compile this second assembly, load it into Reflector, and look at TestMethod:

public void TestMethod()
{
    string sro = MyStringTestClass.StaticReadonly;
    this.SomeOtherFunction(sro);
    string sc = "Const String";
    this.SomeOtherFunction(sc);
}

As you can see, a const string is not a reference to anything. While a static readonly string is still a reference to a field in a class, which will be resolved only at run time, a const string is actually “copy/pasted” by the compiler. What does that mean? At first glance, it means a theoretical performance gain, because no field lookup takes place, which is why FxCop usually recommends it.

But there is one big caveat: say you have multiple assemblies, one that provides the const and one that consumes it. What happens if you change the two strings in the provider.dll without touching the consumer.dll? MyStringTestClass.StaticReadonly will pick up the new value, but the const string will not change, because its value was inserted literally. You will need to recompile the consumer.dll to have the old string replaced with the new one.
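
To illustrate the failure mode (my own hypothetical sketch, reusing the class from above): imagine the provider assembly is recompiled with new values while consumer.dll is left alone.

// Hypothetical "version 2" of provider.dll -- consumer.dll is NOT recompiled:
public static class MyStringTestClass
{
    public static readonly string StaticReadonly = "New ReadOnly String";
    public const string ConstString = "New Const String";
}

// What the old consumer.dll observes when run against the new provider.dll:
//   MyStringTestClass.StaticReadonly -> "New ReadOnly String" (field lookup, resolved at run time)
//   MyStringTestClass.ConstString    -> "Const String"        (literal baked in at compile time)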

I have just been bitten by this. I have two provider.dlls, one for the test and one for the live environment, but only one consumer.dll, and I accidentally declared a string as const. Needless to say, deploying the consumer.dll to the live environment led to some… interesting… results. So yeah: consts are really useful, but sometimes a static readonly field works better.

PS: I think the same applies to other value-types like int as well, but I never shot myself in the foot with an int.
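
For what it’s worth, here is a quick sketch of the same pattern with an int (my own example, not from the original post; the inlining works the same way for other compile-time constants):

public static class MyIntTestClass
{
    // Resolved through a field lookup at run time:
    public static readonly int StaticReadonlyInt = 1;

    // Copied into every consuming assembly at compile time,
    // so it only changes when the consumers are recompiled:
    public const int ConstInt = 2;
}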

Firefox 3.0 – ruining a perfectly good browser

Okay, so I’ve just uninstalled an application. That alone is not really spectacular news, but the application in question was the Firefox 3.0 web browser. Yes, I am a bit late, given that the browser was released 6 months ago, but I am now so fed up with the broken URL bar that it had to go.

Let me start with the usual disclaimer: this is highly subjective and based on my habits.

So yeah, Firefox’s URL bar. I regularly browse some web pages, for example Wikipedia, both in English and German; their URLs are en.wikipedia.org and de.wikipedia.org. I am also a regular visitor of stackoverflow.com, where I was part of the private beta back when the URL was still beta.stackoverflow.com. And I regularly visit some German news sites like Spiegel.de or Heise.de.

So, what happens if I want to visit the German Wikipedia? I usually type in “de”, press Down once (since de.wikipedia.org is the first result) and hit Enter. Not so with Firefox 3.0: the first result is Heise.de or Spiegel.de, since they END in .de.

When I start to type in “sta” to go to stackoverflow, the first result is… beta.stackoverflow.com, because I visited that site a lot, but that link has been obsolete for MONTHS. And no, I am not going to clear out my cache or history or whatever, just because the browser is too stupid to learn that beta.stackoverflow.com has not been used for 2 or 3 months now…

Oh yeah, and that new download manager… it is as useless as it ever was as a download manager (hint: a good download manager works essentially like “wget -c”, not like “I may just leave that temp file behind, or maybe not, maybe I’ll even resume if the download is interrupted, but… meh, let’s just start from 0 again”), but now they even removed that “Saving to: (Folder)” button that allowed you to quickly open the download folder. That was a really pointless change between Firefox 2 and 3, because it removed convenience without adding any value.

Don’t get me wrong: I think the Firefox guys have done an important and excellent job. They succeeded where others failed: in offering a stable and viable alternative to Microsoft’s Internet Explorer, which is still a very inconvenient browser in version 7.

But changing core behaviour between releases (and the URL bar is part of the core of a browser) is something that should be carefully considered, and at the very least there should be an option. Maybe not in the GUI, but at least about:config should allow me to change the behaviour. (You can change some of it, but you cannot get the Firefox 2 behaviour back.)

I believe I heard that Firefox 3.1 is supposed to change this, but since there are now viable alternatives, Firefox just got thrown off my hard drive. Ironically, I am now running Chrome, a browser that I avoided at first. Google recently changed its policy, and the default settings are a lot saner now. I do not know if I am going to keep it, but for now, it is my replacement. Chrome’s download manager is even worse, though, but at least the URL bar works properly. And at the end of the day, there is still Opera.

Good luck with Firefox 3.1 and 3.2; maybe we’ll see each other again in the future.