Exchange Profile Migration Tool – Update?

[Photo: Maize Maze]

One of the biggest challenges of any Exchange migration is the configuration and re-pointing of Outlook profiles. There has been a tool around for some time to help with this, called the Microsoft Exchange Profile Update Tool.

Yesterday the Microsoft download site was updated with a ‘new’ version. I think it’s new because it was only posted yesterday, but there is nothing in the package to tell you what is new. The exe is from July and the documentation says it was updated in February, yet the properties of the exe still say Version 1.0 and the documentation says that it only applies to Exchange Server 2003 SP1. As Exchange Server 2003 SP2 was only released in October, I suspect that the text should say that it is an update for Exchange Server 2003 SP2, but that’s purely speculation. Anyway, here’s the link to the new Microsoft Exchange Server Exchange Profile Redirector. Oh yes, by the way, it seems to have a new name as well, but the exe is the same.

Get Safe Online

[Photo: Derelict House]

Another home-computer security site – this time from the UK Government. Get Safe Online focuses on three particular areas:

  • Protect your PC
  • Protect yourself
  • Protect your business

It seems a sensible way of focusing the questions. The site looks reasonably comprehensive, though I can’t see anything on working as a restricted user.

Customer Experience Idea

[Photo: Haighton Path]

Something occurred to me today. I was completing a registration for Jonathan’s laptop when, at the bottom, it asked the usual question – “we would like to send you promotional…” – you know the one.

As I had registered with my true production email account, which I want to keep clean and away from junk, the answer was a definite ‘no’. But then the thought occurred to me: if there was an option to say “yes, but to a different email account”, the answer would probably have been yes. I don’t think I’m unusual in running multiple email accounts. I have one (Hotmail, as it happens) where all of the advertising goes. I look through it occasionally, but not in detail, because I have another account where I receive emails that are important, personal, etc. If a company is important it gets given the first email account; if it’s not, it gets the Hotmail one.

The registration of the laptop is important, so they get the ‘important’ email address; the advertising isn’t…

Am I the only one who thinks like this?

Structured Active Directory Schema Management at Microsoft

[Photo: Slow]

Microsoft have published another one in their series of documents detailing how they do IT internally. This one covers the whole arena of Active Directory Schema Management. It’s an interesting read.

If you are looking for something that removes the leap-into-the-dark feeling that anyone updating the schema gets, then sorry, this document doesn’t do that. What it does do is outline a practical, industry-standard mechanism for reducing the risk, but nothing that actually removes it. Microsoft seem to have become confident through doing lots of changes, which I suppose is an advantage that they have. Most of us do so few schema updates that we are always going to be wary of them.
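To make the risk-reduction idea concrete, here is a minimal pre-flight sketch of my own – it is not from the Microsoft paper – that inspects the schema partition before an extension is attempted. The domain controller, forest naming context, credentials and attribute name are all hypothetical, and it assumes the third-party Python ldap3 package:

    # Pre-flight check before a schema extension: confirm the new attribute
    # does not already exist, and record the current schema version.
    from ldap3 import Server, Connection, ALL, BASE

    SCHEMA_NC = "CN=Schema,CN=Configuration,DC=example,DC=com"  # hypothetical forest
    NEW_ATTR = "exampleCorp-EmployeeBadge"                      # hypothetical attribute

    server = Server("dc01.example.com", get_info=ALL)           # hypothetical DC
    conn = Connection(server, user="EXAMPLE\\schemaadmin",
                      password="...", auto_bind=True)

    # Does anything in the schema already use this lDAPDisplayName?
    conn.search(SCHEMA_NC, f"(lDAPDisplayName={NEW_ATTR})",
                attributes=["attributeID"])
    if conn.entries:
        raise SystemExit(f"{NEW_ATTR} already exists as "
                         f"{conn.entries[0].attributeID}")

    # Record objectVersion on the schema container so the change is auditable.
    conn.search(SCHEMA_NC, "(objectClass=*)", search_scope=BASE,
                attributes=["objectVersion"])
    print("Schema objectVersion before change:", conn.entries[0].objectVersion)

None of this removes the leap in the dark, of course; it just documents where you jumped from.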

In my personal opinion the Microsoft technologies currently contain far too many leap-in-the-dark moments that have the potential to result in massive impacts on the customer base. Schema changes are one of them, group policy changes another; but perhaps that’s what we get when we cry out for more powerful tools.

Media Center Update 2

[Photo: Media Center]

I have installed Media Center Update 2 today. All seems to be fine so far. A few little tweaks, but nothing dramatic. I noticed something on optimisation in the settings pane – has that always been there? I don’t remember seeing it before.

Hopefully it will have fixed the occasional black screen that we get – it will certainly make Emily happy anyway.

System Center Capacity Planner 2006

[Photo: York Museum Gardens]

Over the weekend Microsoft released a beta of System Center Capacity Planner. From my point of view this is actually a very big step.

For some time now Microsoft have published best-practice type papers, but have relied heavily on the hardware vendors to provide the sizing information. In my experience this has led to a situation where one of two things happens. If you get all of your hardware from one vendor then you get a nice, neatly packaged answer; if you are in a multi-vendor hardware environment then you get a story which is optimised at each layer of the solution.

The example that always comes up in Exchange environments is storage. If the storage vendor is different from the server vendor then Microsoft will be no help at all, and the storage vendor will give you their optimised solution, as will the server vendor. Neither vendor wants to be the problem, so they both over-do it a bit to make sure they are OK. The only way out of this is to do lots of testing, but that’s very difficult and costs more than just buying the over-done hardware. The problem with over-done hardware isn’t the capital cost though; it’s the fact that it’s normally more complicated than the ‘good enough’ stuff and costs more to support.
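To put some purely illustrative numbers on the over-doing (mine, not Microsoft’s or any vendor’s), here’s a back-of-the-envelope sketch of how two independently ‘safe’ margins compound:

    # Back-of-the-envelope Exchange storage sizing with hypothetical numbers.
    mailboxes = 2000
    iops_per_mailbox = 0.5                         # assumed average user profile
    required_iops = mailboxes * iops_per_mailbox   # 1000 IOPS actually needed

    server_vendor_margin = 1.3    # server vendor adds 30% "to be safe"
    storage_vendor_margin = 1.3   # storage vendor independently adds another 30%

    quoted_iops = required_iops * server_vendor_margin * storage_vendor_margin
    overshoot = (quoted_iops / required_iops - 1) * 100
    print(f"Needed: {required_iops:.0f} IOPS, quoted: {quoted_iops:.0f} IOPS "
          f"({overshoot:.0f}% over-provisioned)")
    # Needed: 1000 IOPS, quoted: 1690 IOPS (69% over-provisioned)

Each margin looks modest on its own; stacked, they become the complexity you pay to support for years.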

The fact that Microsoft is stepping out from under this shadow is a big move – it would be even bigger if they were to attach some penalty system to it. A penalty system that works both ways: if we tell you to buy too much we’ll pay, and if we tell you to buy too little we’ll pay. That would be a very courageous thing to do but would be a massive boost to customer confidence. It would also make Microsoft’s life easier too. Why would anyone not buy the recommended hardware configuration? And if the software actually works, people would then get services that didn’t require a call to Microsoft every week. Microsoft would then have a chance to scale down its support organisation, and its reputation for delivering reliable services would increase massively.

Perhaps I’m writing fantasy now and I should move on.

Ah, no, not another Top 100 list

[Photo: Blackpool Illuminations]

This post is going to sound like I’m having a bad day, but I’m not. I am, however, fed up with Top 100 lists. It seems to be all we get on the TV these days, and now I have to have all of my time and bandwidth wasted by another blog one – this time from CNet. As I subscribe to a number of these sites, I have today had to endure endless ‘thank you’ and ‘I’m honoured’ posts.

Please, no more. In the same way that I would rather my TV told me something useful, rather than repackaging the same stuff I’ve seen a hundred times before, I’d rather these top 100 blogs told me something constructive rather than pointing to someone repackaging their content. Come on guys, you are better than that. We are busy people, and each one of these posts contributes to the (as Arthur C Clarke put it) “World-Wide-Wait”. Feeds are great for deciding who I want to listen to; this type of thing reduces their value. I’m only ranting because this isn’t the first time it’s happened. I really do not care where you have been ranked – I care about the content you provide me.

PS – these guys deserve to be noticed, it’s the cheap journalism that I hate.

Where is the real news?

[Photo: On Off]

This week’s ‘announcement’ from Sun and Google has got me thinking. Why is the IT industry so fixated with announcements? This announcement changes little, not because of its content but because announcements never change anything. Announcements simply indicate an intent to change something, and those changes are always incremental. Even if Sun and Google had announced a web-based version of Office, or whatever the big rumour was, it would still have been incremental, and for most people the increment would have been quite small.

Within the industry we still like to foster this idea that we are radical and thrusting, changing things overnight, but it just doesn’t work that way. Even the big things aren’t that big. Skype is big in terms of numbers of people, but in terms of time used it is still very small for most people. Firefox is big in terms of numbers of downloads, but it’s still only a small percentage. RSS is big, but it’s about increments again. We have to start realising that we are not creating revolutions, we are creating incremental change, everywhere. Over time those increments build up to make significant changes, but a single announcement is only an increment.

It was while I was having these thoughts that the Read-Once DVD rumour/scam/hoax started. It kind of proved the point: more interested in hype than substance. Come on people, think about it – why would anyone want to buy a DVD that you can only read once, and how on earth could it be cheaper than one that you can read more than once?

Here’s some real news for you. Software is changing every day – live with it. But the changes are incremental – live with it.

Vista

[Photo: What a Desk]

I try to keep my desk environment nice and tidy. I don’t see any point in having everything in front of you cluttering up your view. Today is a bit cluttered though. I have installed Vista on my Tablet, and for the short term that means more than one device on my desk. And because I’m so used to a keyboard, I’m not using the pen interface to build it.

First impressions are very good. I don’t get the eye candy of glass (if I want the box to be usable) because the graphics card isn’t up to it.

The most interesting thing for me is the impact on end-user (me) productivity.

(Must work out which one of the kids it is that leaves so many pens on my desk)

IT Marketing Does it Again

[Photo: Close It]

I would hate to try to count the number of times IT marketing has left my emotions completely messed up. I watched Dell’s new marketing campaign this morning and wasn’t sure whether to be angry, be physically sick, throw something at the screen, say ‘what?’, laugh at the irony of it all, sit in amazement that other companies also thought it was a good idea, or marvel at the fact that a modern IT company chose to use very old puppetry techniques.

http://www.delltechforce.com/

The great thing is that they are using ‘big iron’ servers to host the site; at least, that’s who I assume the dig is aimed at.

Blogs Succeed where Newsletters Fail

[Photo: Water]

This is purely a personal perception, which I have not had the chance to investigate much, but it’s a view that may resonate with others.

I work for an organisation that has not yet embraced blogs internally, but does do quite a lot with newsletters. I rarely read these newsletters, and I know that others are similar. I take in far more information through blogs than I ever do through newsletters. So why is that?

Some of it, I am sure, is related to a lack of concentration on my behalf. I have become the ultimate skim reader. If the title or the context doesn’t make me want to read – I won’t. Skim reading newsletters is not easy. They are normally created in a form that assumes they will be printed off, and this doesn’t facilitate skim reading. I tend to skim read because most of the time I don’t need to know a piece of information; it’s more important for me to know it exists and that I can get hold of it quickly. That’s where blogs have a huge advantage. In my reader I can see that thousands of bits of information exist, and when I need them I can go and get them. I know that the information exists because I have skim-read through it. If something new and pertinent comes up I’ll read it there and then, but normally I’m in skimming mode. Why should I waste my time reading something in detail?
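As a rough illustration of skimming mode, a feed reader only has to pull titles and links; everything else can wait until it’s needed. A minimal sketch using the third-party Python feedparser package, with a placeholder feed URL:

    # Skim a feed: pull only titles and links, and read nothing in detail.
    import feedparser  # third-party: pip install feedparser

    feed = feedparser.parse("https://example.com/blog/rss.xml")  # placeholder URL
    for entry in feed.entries:
        # Enough to know the information exists and where to find it later.
        print(entry.title, "->", entry.link)

Try doing that with a newsletter laid out for printing.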

Another reason is similar to this one, but subtly different. Individual blogs tend to deal (if they are done right) with a single subject. Newsletters tend to deal with a multitude of things. Finding the quality in all of those words is very difficult (and boring).

The final reason (for me) is that there is a sense of control with blogs which corporate newsletters don’t have. I have configured my reader to go and get information from this particular source; I am in control. Compare that to my normal attitude to newsletters – “oh no, what have communications sent me now”. The ownership is completely different. Yes, I know these communications people are trying to do me a favour, but it doesn’t feel like it.

So give me a feed any day, don’t bother sending me a newsletter, and definitely don’t give me another repository to look in.

Service Entropy in IT Systems

[Photo: Heat On]

A while back (probably years now) Steve introduced me to the concept of ‘service entropy’. By this he was referring to this definition of entropy: ‘The tendency for all matter and energy in the universe to evolve toward a state of inert uniformity’. Both Steve and I are from an engineering background where entropy is evident in real things. Steve’s theory was that IT services operate the same way and drive down towards the ‘lowest common denominator’. If you heat something up, it doesn’t matter how hot, it will always cool down to the level of everything around it, unless you keep heating it. If we view an IT project as a heating process and the resulting service or application as the thing being heated up, what happens after the project has finished is the entropy process.
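To push the analogy a little further than Steve did (this formulation is mine, not his), the closest physical model is Newton’s law of cooling:

    T(t) = T_{env} + (T_0 - T_{env}) \, e^{-kt}

where T_0 is the level of change the project delivered at handover, T_{env} is the organisational baseline (the lowest common denominator), and k is how quickly the organisation erodes change. Without continued input, T(t) always decays towards T_{env}; the only things you control are how hot you make T_0 and how small you keep k.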

As I have pondered this and how it impacts IT infrastructures, I have seen it at work all over the place. As with the heating process, it starts before the project has even finished: as soon as the project is exposed to the cold light of day, the ‘heat’ starts leaving it. I’m using ‘heat’ here as a metaphor for the changes that the project is seeking to make in the environment in which it is being delivered.

As an architect who has a vision for a certain amount of ‘heat’ (change) to be produced by a project, it is a huge challenge to consider how you can insulate the results of the project from entropy. Because entropy dissipates the energy within the ‘closed system’, the larger the ‘closed system’ the more energy you have to put in. In the context of an IT ‘closed system’, the extent of the change (heat) may be the whole organisation or may be a smaller group. In a ‘heat’ system it’s the insulation that makes it ‘closed’; for an IT system it’s probably the organisational construct that replaces the insulation and keeps the ‘heat’ in (you have to remember, though, that there is no perfect insulation).

Entropy explains why it is easier to impact a small organisation in a big way and make a permanent change. Entropy explains why it is very difficult to make a small change in a large organisation and make it stick. Entropy also explains why a large change in a large organisation is practically impossible without massive amounts of ‘heat’. I tend to be involved in making medium-sized changes in large companies, where organisations presume that because the change is ‘not huge’ it should be easy. Most of the time these projects result in deploying technology and some process change, but never deliver the productivity increases that were expected. At this point it is generally the technology that is blamed, rather than the lack of ‘heat’. It’s not normally the technology itself that generates the ‘heat’ though; it’s more to do with the way that the technology generates change and community.

The other thing with large systems is that it is impossible to apply the ‘heat’ uniformly. Some people will get direct ‘heat’ from the project; others will get it transmitted to them from someone else. Some people are conductors and some people are insulators. In some ways this lack of uniformity is worse than having a small amount of ‘heat’ everywhere. The reason is that when ‘hot’ meets ‘cold’ you generally get a reaction that cools down the ‘hot’ as much as it warms the ‘cold’. Take the example of calendaring and scheduling functions. These have been available for years (decades even) in corporate email services. Some organisations have managed to make the transition to using them uniformly, but many have not. As soon as there is any doubt (cold) about whether someone uses their calendar, the value of all calendars is reduced.

There is a great danger in taking this metaphor too far, but I just want to take it one step further. You can undertake the heating process in one of two ways. You can either heat a small area to a very high temperature and hope that it will reach the edges of the system, or you can heat everything in a uniform way to the same temperature. The Internet has shown us that for large systems heating up a small area to a very high temperature works better than trying to heat up the whole. Take the example of Google Earth: this is a very ‘hot’ piece of technology and has generated a huge community. A few people connected with that ‘heat’ and, bit by bit, the ‘heat’ is distributed. The trick is to keep the ‘heat’ going at the centre, which is what they have just done with the integration of National Geographic data. As a system becomes older it is more difficult to keep the heat going at the centre, and that is the challenge for Microsoft with Office 12 and Vista.

So how, as architects, do we resolve the entropy problem? For starters, we spend as much time and attention on the insulation surrounding our heat source as we do on the heat source itself. In other words, we try to understand where the heat will be taken out of the project and insulate the project from it. The other thing we do is to try to generate as much heat as possible, knowing that some of it will be dissipated.