Tales of the IT Helpdesk (Tony S)

Microsoft IT Camp (13 December 2011)<div>I was invited to attend a Microsoft test event on Monday 12th December. The TechNet staff were trialling a new format of training session and wanted feedback from people within IT on the format, and on how well it would work if rolled out as part of Microsoft's normal training material. The session was held at Cardinal Place in London; a great venue, very modern with superb facilities, but as I’m based down in the South West, it was a long way to travel.<br /><br />The event was opened by a rather hoarse Simon May, who left a lot of the talking to Andrew Fryer. The basic idea was to showcase the updated versions of the System Center products, with a specific emphasis on virtualisation using Hyper-V, and a particular focus on setting up clusters. I’d seen some previous material on the earlier versions of these products, but was keen to see the 2012 versions due out next year.<br /><br />For those that don't already know, there has been a move towards much more integration of the various products within the System Center range. Each product is now seen as an integral part of the overall suite, rather than as a separate product that just happens to work with the others. This seems to be a sensible move, and it means that sysadmins should have access to all of the tools they need to manage their data centres.<br /><br />Rather than use high-specification equipment, Andrew wanted to demonstrate that it was possible to set up a test lab using older machines; the sort that can be picked up on eBay, or that might be sold off after an equipment refresh.
He had several laptops: two acting as the Hyper-V hosts, and one acting as a makeshift SAN unit. He proposed to join the two Hyper-V hosts as nodes in a single cluster.<br /><br />The presentation did not go quite as planned! He ran into several key issues during the set-up, but as many of the people present were very familiar with the product, they were able to highlight a number of the factors that had caused the hiccups. What was interesting was that even with these technical issues, the whole process didn’t actually take that long.<br /><br />During the day, and again at the end, the staff asked for feedback on the event, which it has to be said was generally positive. However, quite a few people (myself included) felt that they had missed a trick; many of us had our laptops with us, and it would have been a really impressive feat to have got these working as part of the set-up as well. There was a general feeling that most delegates would have been more than willing to bring their own equipment, possibly even downloading and installing some items in advance, in order to make this more effective.<br /><br />Having said that, the staff were more than willing to consider this, along with a couple of other ideas that might allow those present to take a more active role. I’ve seen a couple of VDI infrastructure plans, and I feel that they could easily set up something that attendees could connect to and use to work with VMs, giving a real “hands on” experience.<br /><br />The plan is for the new format to be modified, based partly on experience but also on the feedback from those that were there. They also hope to develop it further to encompass more topics, and the organisers were keen to hear which ones were of the most interest. Some comments were made about making sure that any future events would be held in other locations; the Microsoft offices are great, but not everyone can get there easily.
Although there were no commitments, it seems that they intend to try to cover more of the major population centres than before; and that can’t be a bad thing!<br /><br />I have to be honest, I do enjoy these sorts of events. I feel quite strongly that those of us who work in IT can all too easily develop a “silo mentality”. We get so wound up in day-to-day problems, and all too often work in small groups, that it’s far too easy to forget about the bigger picture. This can also make the job less enjoyable; it’s just too easy to find the passion for the work drifting away. By going along to these sessions, it’s possible to see new ways of working that might otherwise pass us by, to meet other professionals, and to hear what problems they face. I find that it can help generate a new enthusiasm for the work; something that can all too easily be lost when you are dealing with very basic problems most of the time.<br /><br />All in all, I found it to be an interesting, useful and enjoyable day. I suspect that future events will be along the same lines, but will benefit from the comments of those that have taken part so far. If you see one in your area, I would urge you to go along; it will most definitely be worth the time and effort.</div>

Jeux Sans Frontières (5 December 2011)

Back in the 1970s and 1980s, there was a TV programme called “It’s a Knockout”. This featured teams of people from across the UK competing in a series of increasingly silly games.
These programmes were presented by the wonderful Eddie Waring and Stuart Hall; anyone who watched will remember the way that Stuart used to collapse in fits of uncontrolled laughter at the various antics.<br /><br />The format was so successful that it spawned an international contest, “Jeux Sans Frontières” (Games Without Frontiers), in which towns from across Western Europe would take part and host these crazy contests. It was a lot of fun, and sometimes I wonder if it wouldn’t be a good idea to resurrect the concept.<br /><br />I mention this because it’s clear that there are a lot of companies in the SME market that now have to deal with cross-border relationships; even quite small businesses like ours are able to sell to other countries thanks to the power of the Internet. In our case, we have offices in other countries, and our IT staff need to support users in those countries as well as in the UK.<br /><br />This is not easy. I now have an enormous respect for those support people in call centres that provide multi-language telephone support. Bearing in mind that my French was learned in school some 40 years ago, via the “Où est la plume de ma tante?” (“Where is my aunt’s pen?”) method of teaching, I was quite nervous about having to deal with potentially complex technical issues in another language.<br /><br />Part of the problem is having the confidence to try to speak in another language, particularly if you don’t do it regularly. If you mutter something in an embarrassed way, and the other person responds with an impatient “Quoi?”, it’s easy to get nervous, and that just makes things harder. However, it’s surprising just how much you can communicate with a relatively small vocabulary if you speak confidently.<br /><br />Try this: think of a phrase or sentence at least a couple of dozen words long. Now write out every third word on a piece of paper and give that to someone to read.
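As a toy illustration of the experiment (a quick Python sketch; the sample sentence is made up, not from any real support call):

```python
import re

def every_third_word(sentence):
    # Keep only every third word of a sentence; a toy model of how much
    # meaning survives when most of the vocabulary is dropped.
    words = re.findall(r"[\w']+", sentence)
    return " ".join(words[::3])

# A hypothetical support phrase:
phrase = ("Please save any open documents and then restart "
          "the computer so that the latest updates can be applied "
          "before you try to open the drawing again")
print(every_third_word(phrase))
# -> Please open then computer the can before to drawing
```

Even with two-thirds of the words gone, the "Please … computer … drawing" skeleton still carries a surprising amount of the intent.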
The chances are that they will still understand what you mean from only the few words selected. Natural language is highly redundant, and this seems to hold in most cases (and not just in English), even with fairly complex phrases. It’s not necessary to get grammar or syntax absolutely correct, as long as you use the appropriate words. For our support calls, we just had to learn the right phrases, and be able to use them appropriately.<br /><br />About a year ago, I bought an older server specifically to support a virtual platform. I then used the TechNet site to obtain copies of operating systems in the languages we have to support. Although the configuration process and screens are the same, it’s helpful to know some of the differences in technical names; for example, in French “Computer” is “Ordinateur”, but “My Computer” is “Poste de travail”. Getting the correct phrase is not just a case of direct translation!<br /><br />This has helped enormously, and I can now confidently tell people on the phone to “Cliquez sur Démarrer”, “Aide et support”, “Assistance à distance”, “Invitez un ami à se connecter”… and so on. It has also allowed us to take screen shots of the various windows with the appropriate text in French, German and Hungarian, and these are used to create user documentation for inclusion in an FAQ section of our help desk software.<br /><br />The end result is a better service for the end users. It makes them feel more confident in the support that we provide, and we have had some really good feedback from their staff. It also means that our support staff (i.e. me!) can feel a bit more comfortable when the dreaded “33” country code appears on the CLI of the incoming call.

Video Conferencing (17 November 2011)<div>This is a topic that crops up from time to time; and it’s one that I have some experience with.<br /><br />A decade or so ago, people were selling Video Conference (VC) equipment for use with ISDN lines; these were OK, but there were technical issues with the data stream bandwidth and Quality of Service, and the user experience could be less than satisfactory. Pictures would be blocky or pixelated, and even audio could be a bit of an issue, especially with multi-way calling.<br /><br />But the benefits to the business were really valuable, so people tolerated the poor quality. Even though we only had a couple of VC meetings every week, the cost savings were very significant to the company that I worked for at the time (2000–2004); we had calculated that we were saving around £25k to £35k per year. This was based wholly upon petrol / mileage costs saved, with the sites about 200 miles apart.<br /><br />When it became possible to use IP-based systems, the quality of both audio and video improved considerably, as the compression ratios were better and bandwidth higher and more consistent; the user experience was such that people actually wanted to use the facility. I put this in at my current employer at all company sites, and I’ve estimated that we have saved around £450k to £500k over the last 6 years (for a capex of £25k and very little opex).
This is based upon petrol / mileage / flights, hotel accommodation and subsistence allowances that would otherwise have had to be paid.<br /><br />This of course does not take into account the less tangible benefits: work/life balance (less travelling, fewer late nights), carbon footprint / environmental costs, and user interaction. We found that most staff were able to collaborate better in VC meetings, and this generated some useful ideas which led to key improvements in many areas. It also helps staff (and even some managers) feel more engaged with the activities of the business.<br /><br />It’s become so valuable that we are now seeing senior managers wanting access to a VC function on their desks. We provide this capability through units which look like PC monitors, but can be switched to VC screens. We have experimented with smaller products; Skype, OCS / Lync and others, but the managers do like the larger viewing screen and it’s difficult to persuade them to use smaller viewing windows.<br /><br />I think that almost inevitably, we will be moving to telepresence at some stage; once they see the improved quality, I suspect they will be demanding it instantly. I’ve seen it and think that it is pretty awesome; if you haven’t had the chance, then call your regional supplier, as they will be delighted to demonstrate their offering. Our current equipment is still functioning well, and has more than paid for its installation, so replacing it would not be too much of an issue.
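A rough payback calculation on the figures above (using the midpoint of my £450k–£500k estimate; purely illustrative arithmetic, not audited numbers):

```python
capex = 25_000                  # installation cost (GBP)
saved_over_six_years = 475_000  # midpoint of the GBP 450k-500k estimate
annual_saving = saved_over_six_years / 6
payback_months = capex / (annual_saving / 12)
print(round(annual_saving), round(payback_months, 1))
# -> 79167 3.8
```

On those numbers the kit paid for itself in under four months, which is why a hardware refresh is an easy conversation to have.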
The costs for purchasing the new hardware are a bit higher, but considering the cost savings, it would be well worth it.</div>

A-PDF Watermark Service is one of the best tools I have come across (4 November 2011)

For some time now, we have had a bit of a technical challenge within our Technical Drawing office. These guys produce about 2,000 to 3,000 different engineering drawings a week, all of which have to be saved and then accessed by a large number of people within the factory, as well as by others throughout the different sites belonging to the business.<br /><br />We have a Document Management System that allows us to link the drawings to various modules within our ERP software; this is really useful as part of a drive towards using less paper throughout the business. However, it only works if the file is attached to the right item straight away, and often that isn’t possible, for a number of technical reasons.<br /><br />The problem is that when you have that number of files, there is a key issue: how do you identify the right drawing and associate it with the file? We have tried a number of different methods with file names etc., but this doesn’t always help. Imagine that you have the printed drawing; it says that it is a left-handed swivel arm, but how do you know what file that drawing came from if you want a second copy?<br /><br />After some discussion, we decided that what we needed was a simple tool to allow us to imprint a modified file name onto the drawings, including the works order number, quantity, and required date of the component.
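The idea can be sketched in a few lines. This is a hypothetical illustration of composing such a stamp string (the field layout and file names here are my invention, not A-PDF's actual format):

```python
from datetime import date

def stamp_text(source_file, works_order, qty, required):
    # Compose the text imprinted on each drawing so that anyone holding
    # a printed copy can trace it back to its source file and works order.
    return f"{source_file} | WO {works_order} | qty {qty} | req {required.isoformat()}"

print(stamp_text("swivel-arm-LH-rev3.pdf", "123456", 25, date(2011, 11, 18)))
# -> swivel-arm-LH-rev3.pdf | WO 123456 | qty 25 | req 2011-11-18
```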
This then allows anyone looking at the drawing to identify exactly what file it came from, so that they can quickly locate the relevant file and know where to look within the ERP system.<br /><br />After some considerable research, we found A-PDF Watermark Service from A-PDF. This useful little tool allows us to add those details of the drawing’s file name at a designated place on each drawing; and it does so automatically.<br /><br /><a href="http://www.a-pdf.com/watermark-service/download.htm">http://www.a-pdf.com/watermark-service/download.htm</a><br /><br />Using this product means that we save the time spent hand-writing (or typing) the information onto the drawings, and it also removes the element of human error. It’s installed on the relevant file server and runs as a background service that processes the files automatically; it seems to easily handle the workload that we are throwing at it.<br /><br />We highly recommend this product as a simple but effective solution.

Office365 (part 2) (27 June 2011)

After my last post about Office365, I thought that I would write a bit more about why I think it would be such a good product for us; the rationale behind the thinking.<br /><br />Some 10 years ago, less than half of the office staff had PCs, and there were perhaps 2 PCs in the factory area. Now, everyone in the offices has a PC (some have more than one), and in the factory areas there are just over 2 PCs for every 5 staff. (These are shared and used as required to access relevant data.)
As you can see, there has been significant growth in the use of IT systems over the last decade.<br /><br />About 6 years ago, some people started working with laptops, using VPN connections to get access to systems in the office, primarily for email when they were off site. To start with, these were senior managers, IT staff and some sales people, but over the last couple of years the number has increased to include many others. We even have a couple of ladies from our customer support team who regularly go out to visit partner companies, and they take a “pool” laptop with them.<br /><br />As you’ll realise, having access to email, CRM and ERP systems, along with data files, is pretty important for many of these staff, and it helps them do their jobs far more efficiently. However, although the process to connect the VPN is really easy, some of them still occasionally have difficulties in making the VPN connection, and we have been looking to see if there is a way to make their lives easier.<br /><br />One thing that was discussed in the Microsoft “Jump Start” sessions a few weeks ago was the concept of a “hybrid” cloud; one that links public and private cloud options together. In the session, there was a discussion about linking Office365 via LDAP to an existing Exchange Server inside a company’s LAN. Effectively, this would extend the mail function so that people designated in Active Directory could use Office365 when outside the network, while staff inside would use the normal Exchange Server; the two linked together effectively as a single system, without the need for VPN connections.<br /><br />I think that this could be a major benefit for us; it would make life easier for all staff that travel, as they would have access to their email without having to worry about running VPN connections.
They could use their laptops, their smartphones, tablets, or even a PC belonging to the people that they are visiting to get access to their mail and other material.<br /><br />As for staff inside the business, they would continue to use the existing Exchange mailboxes, but they would still see the travelling staff as being on the same system. It might even be an option for some internal staff to use a tablet while moving around the factory; although I’m not sure that these devices are quite robust enough for some of the heavy-handed individuals we employ!<br /><br />Of course, there are security issues, but that is a discussion for another time. I feel that the hybrid option would make a lot of sense for us; it would provide a sensible and elegant solution to a problem that has caused a few issues and will only get more serious as time goes by. I think that Office365 is a product that deserves some serious consideration; it could provide a real option for our travelling staff, and a real advantage to the business.

Office365 (13 June 2011)

As promised in my last post, I’m going to write about the new Office365 product, for which I have been testing the beta version. If you want to take a look at the beta for yourselves, then sign up here:<br /><br />http://www.microsoft.com/en-gb/office365/enterprise/hosted-software.aspx?CR_CC=200038628&WT.srch=1&CR_SCC=200038628<br /><br />Essentially, the concept is simple; this is an online product that provides the functionality of the normal Microsoft Office package. It’s run through a browser window, and the key thing is that it can be accessed from any device at any time.
All you need is a standard Windows Live ID in order to get access to the relevant portal.<br /><br />The front page is quite straightforward, very “clean” and uncluttered; it gives a brief overview of what tasks need to be done and how to access the key components. There is also a link to support, the community forum, and information on how to perform key tasks.<br /><br />The Outlook function is accessed from a menu item and is based upon Outlook 2010; even if you are using an earlier version, you will probably be able to work out how to do things. I tried this on an iPhone, and there is a slight difference in appearance as it uses the Outlook Web App (OWA). For advanced Outlook users, there are a couple of functions missing; the public folders option is one. However, I found it really easy to use, and I suspect that most others would have no trouble switching to this product from an existing version of Outlook.<br /><br />There are also calendar, contact list and task list functions. We use these in our normal Outlook set-up, so they might be something that we could use to good effect. For the contacts, we would need to find a way to separate out some of the entries, as otherwise we would end up with massive lists, making it harder for people to find what they need.<br /><br />The Office365 product includes SharePoint Online, which is exactly what it sounds like. It seems to be based upon the SharePoint Foundation product, and offers the same kind of functionality. Although I had a few issues with the provisioning at first, an email to the Support Centre fixed that, and I then found it really simple to set up and use.<br /><br />I’m actually a great believer in SharePoint; I think that it has a lot of functionality that would help fix a lot of business issues and provide a mechanism for resolving several key communication problems.
The only downside is that it sometimes seems very difficult to get users to understand that they can take control of many of their own tasks; they seem to have a very fixed view that only IT staff can do these things.<br /><br />Office365 also offers Lync, the new instant messaging client; I thought it looked very slick and had a number of very useful additions compared to earlier products. Again, this is something that I think we don’t make enough use of. Following a couple of tests, there are some key users that really like the product, but unfortunately there are many more that simply do not want to even try it.<br /><br />Lync can also be used for audio or video conferencing; I did one very quick test and it worked well, but that was only between 2 users within our network. It would have been useful to test it with a few more users for a slightly larger conference call; we may still do that another time, as we have over 140 days left on our 6-month beta licence.<br /><br />The other main features are the Web Apps for Word, Excel, PowerPoint and OneNote; these are very similar to the 2010 versions of the software, and most people will pick them up very quickly. I’m not a great fan of the ribbon interface, but I suppose that I’ve become used to it; and as the Web Apps use the same feature, it makes sense to get used to it now.<br /><br />There are a number of arguments about the use of cloud computing; that’s going to be a topic for another day. Suffice it to say that having tested Office 365, I really like it, and most other users seem to find it very straightforward.
We don’t know the price yet, but I have seen a couple of suggestions for the cost, and I think that it could be very affordable.<br /><br />Office 365 is a really good product, even though it is just the beta version so far; it’s one that I’ll be keeping an eye on over the next few months for sure.

Office 365 Jump Start sessions (31 May 2011)

Last week, I had the opportunity to take part in 3 training events organised around the new Microsoft Office365 product; the replacement for BPOS. These sessions were all online, run using MS Live Meeting, with a mixture of PowerPoint slides and some actual demos of the product in use.<br /><br />The sessions started at 10.00 am Pacific Daylight Time (18.00 BST), as they were being hosted from the West Coast of the USA. They ran until 4.00 pm PDT, which meant staying up until midnight; a very long evening, particularly as I usually get up at 6.30 in the morning. However, as the event was so worthwhile, I don’t feel too put out by that.<br /><br /><strong>(<a href="http://blogs.technet.com/b/uktechnet/archive/2011/05/11/register-now-for-the-office-365-jump-start-for-it-pros.aspx">http://blogs.technet.com/b/uktechnet/archive/2011/05/11/register-now-for-the-office-365-jump-start-for-it-pros.aspx</a>)</strong><br /><br />On the first day, they had a few technical issues with the audio at the very beginning of the session; for some reason, they kept losing the sound from the presenters.
However, once that little hiccup was out of the way, the sessions picked up pace quite rapidly and they got through a great deal of information.<br /><br />The moderator was Adam Carter, who kept things moving along really nicely; he was joined by a number of people with specific knowledge of key components of the package, who went into the various parts in some detail. At the same time, the online participants were invited to ask questions; there were some really great issues raised, and for the most part the moderators were able to deal with these, or to pass them on to the specialists to elaborate further.<br /><br />I’ll write a bit more about the actual product itself in a later blog post; suffice it to say that the various components were explained and demonstrated very well. I would suggest most people had a really good opportunity to see them in action, learn a bit more about some of the basic administrative tasks required, and find out how to make use of the new product.<br /><br />Of particular note was the session on using PowerShell to do some of the admin tasks; for those that are not so confident in using this utility, or are still working out if they need it, the demonstration showed just how flexible and easy to use it is, and I’m sure that many would have gone away determined to learn more about working with the cmdlets.<br /><br />For me, the best demonstration was by Mark Kashman, who gave a superb presentation on the use of SharePoint Online. He had created a demonstration site using the “Fabrikam” company name, and it was quite astonishing; simply one of the best SharePoint sites I’ve seen.
A number of people asked if it would be made publicly accessible as a reference site; he said they would look at this, but he felt that the site was still unfinished and that the team would want to do more work on it before releasing it into the wild.<br /><br />All in all, this was a really great opportunity to learn more about the new Office365 product. It was very well put together and, I think, pitched at just the right level for most of the people involved. The slides are now available to download –<br /><br /><strong><a href="http://borntolearn.mslearn.net/office365/m/officecrapril/default.aspx">http://borntolearn.mslearn.net/office365/m/officecrapril/default.aspx</a></strong><br /><br />They did suggest that the videos will also be available in a couple of weeks’ time, and if I get the details, I’ll add them on as well. They also promoted the Microsoft Virtual Academy, another really great free resource; if you haven’t heard about this, check it out at<br /><br /><strong><a href="http://www.microsoftvirtualacademy.com/Home.aspx">http://www.microsoftvirtualacademy.com/Home.aspx</a></strong><br /><br />I hope that Microsoft put on a few more sessions like this one; if they do, you would be well advised to sign up, as it is a great training resource for IT sysadmins, helping to make sure that they stay on top of the latest products and developments.

DPM across Domains (25 April 2011)

I've been using the Microsoft System Center Data Protection Manager product for just under 4 years, and I really like it. As far as I am concerned, it ticks all of the relevant boxes: easy to install, easy to use and manage, and most importantly, it works really well. It backs up to disk, then from disk to tape.
It uses a relatively small amount of bandwidth, and data recovery is quick and easy. It is simply one of the best backup products that I have come across; far easier to use than many of the better-known software packages.<br /><br />A while ago, the company bought out a partner organisation. This left us with a sales office based in Paris; they are a separate entity, but as they are quite small, they don't have their own IT staff. They had been using the services of another business, but it was decided a while ago that we would take on that responsibility. We needed to provide a backup function to preserve their data, and set about putting this into place.<br /><br />One of the key issues was that they did not have an Active Directory domain on site. Everything was set up as a workgroup only, and this caused a lot of issues, so one of the first things to do was set up a suitable domain structure. Hopefully, this will reduce the amount of admin work that is required; previously, it was necessary to create a local user account on every single piece of kit, which took a lot of work. The new domain was created a couple of weeks ago, and we've now also created a two-way trust between the two AD domains.<br /><br />The next step was to set up the remote site to be backed up by our DPM server, but this was where we hit a snag. Each time we tried to install the agent, it responded with messages that the remote site was not available. I could prove that this was false; I could ping the remote server and even RDP to it from the DPM server. I checked all sorts of things, and each showed that the remote site was fully operational and accessible.<br /><br />So I decided to do a manual install of the agent on the remote site. The first step was to RDP to the remote server, then create a mapped drive back to the DPM server.
Having done that, I opened the folder where the DPMAgentInstaller.exe file is found - that's at \Program Files\Microsoft DPM\DPM\Agents\RA\<version number>\i386 (there is a parallel folder for 64-bit installs).<br /><br />The install went through OK; having installed the agent, it's then necessary to define the correct DPM server. This is done using \Program Files\Microsoft Data Protection Manager\DPM\bin\SetDpmServer.exe -dpmServerName <DPM server name>. Again, this went through OK, but it still produced an error message that there were insufficient permissions to complete the process.<br /><br />After checking the event log, I could see a number of LsaSrv Event ID 6033 errors. These indicated that I needed to modify a registry setting to disable the anonymous logon block. Having done this, it then showed another set of errors indicating that there was still a problem with permissions.<br /><br />Having checked these yet again, I could see that the DPM server was in the correct groups etc., but I also thought to put the DPM administrator account into the local Administrators group. Having done this, the error went away, but the agent still wouldn't connect to the DPM server. However, I ran the SetDpmServer.exe utility again, and this time it completed correctly. When I went back to the DPM console, it showed the agent as installed and connecting to the remote server.<br /><br />So now we are in a position where we can actually back up that remote site. It will be a bit of an issue to begin with, as there is a lot of data on site; I'll probably go over again to do a manual copy of the data to a portable hard drive.
This can then be manually copied to the DPM server to get the initial data load, and then the synchronisation process will only work on the data that has changed from that copy; a great deal less than the full synch process.<br /><br />This is going to make a huge difference to the people on the remote site; they won't have to worry about tapes etc. or what to do if someone goes on holiday. The data is being backed up off site, so it is more secure. The recovery process is really simple and we can give them the confidence that we can deal with it really quickly if needed.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-76982397436062133432011-03-02T03:11:00.000-08:002011-03-02T03:33:25.433-08:00Transformational SecurityA couple of weeks ago, I attended an event hosted by Computer Weekly, SC Magazine and a couple of others. “Information Security Leaders 2011: Transformational Security” - as you might gather from the title, it was a look at how and why things are changing and how to provide security in the newer IT landscapes.<br /><br />Although a lot of people think that these are just junkets, with a chance to pick up some SWAG and eat and drink at someone else's expense, I actually find these events very useful. Working within IT can have its problems; all too often, we work in small groups, and it's very easy to become isolated. This means that we develop set habits, and forget that there may be other ways of doing things.<br /><br />Getting out to events like this can be really useful in many ways. It's interesting to talk to others in the industry and see just what kinds of problems they are facing. All too often, we might think that we are the only ones with a particular issue, only to find many other people with exactly the same problem.
I really like to share advice and information on how we approach some of these issues, and how and why we go down the route that we do.<br /><br />This particular event was very useful. There were some keynote speakers that offered a real insight into just how things are changing and why; and they offered some considered advice on how to look at this as an opportunity. In particular, the concept of "consumerisation" was raised - people using their own equipment for home email, social networks etc, then wanting to use the same items for work. (That's not just the same make or model, but the actual device). <br /><br />At first, I thought that this was not an issue that we would face; but then I realised that it has already happened. We have a number of staff with their own smartphones who are trying to connect up so that they can get their email on the device. It's not been a major issue so far; but what would we do if one of those people then left the company? (OK, cancelling their email account is a start, but what if they had access to someone else's account as well?) <br /><br />Or how would you react if they lost their mobile device and someone else found it and could then use it to get access to company systems? The answers may seem simple, but as the speakers pointed out, this is the thin end of the wedge, and it's going to start happening a lot more often and involve a lot more devices and people.<br /><br />All in all, the event was a good day (and yes the food was good!); it was also very useful from the point of view of getting people to think slightly outside of their comfort zone. If there are any more events of this type, either this year or in the future, I would strongly recommend taking the opportunity to get along.
You won't regret it!Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-55711361567028850792011-02-02T01:37:00.000-08:002011-02-02T02:03:52.587-08:00Email signaturesSome time ago, it was suggested that we should have an agreed format for email signatures across the company. Unfortunately, it took some time to get agreement on what format we should use. I could go into the details of this, but it's pretty boring; for example, the discussions on the font to be used seemed to take forever. Suffice it to say that there were numerous discussions and it took quite a while to reach a final decision. <br /><br />There are numerous sample VB scripts out on the Internet for producing an email signature, but none seemed to achieve what we wanted. I did think about trying PowerShell, but I don't yet know enough to be able to do the work using that. As I've used VBScript on and off for a few years, it made sense to try and use that, at least for the time being.<br /><br />The script has taken a little while to put together to make sure that it meets the needs of the business. It takes data from the Active Directory, formats it and places it in the required location. It also inserts a company logo, and there is a bit of conditional text to insert other logos; this is because we attend a number of trade shows, and like to promote these on our emails. <br /><br />There is one slight issue; the email has to go out in Rich Text Format. If it goes out as HTML, the lines get double spaced. This is just because of the way that it gets rendered and I haven't found a way around this. Also if it goes out as plain text, the logo doesn't get inserted. It works by using Word - it extracts the AD data and builds the signature in Word before saving it to Outlook. <br /><br />I'm putting the script below as I am quite pleased with it and the results; if it would be of any help, please feel free to make use of it.
Just copy the text, place in a text file, save it and then change the extension to .vbs - I haven't tested it with all versions of software, but I have tried with Outlook 2003 / 2007 / 2010 on Exchange 2003 (on Server 2003), on PCs running Windows XP and Windows 7 and it worked in each case.<br /><br />(Note that I have removed the specific details of our company so that it is a generic script; you would then have to modify it to show your own details.)<br /><br />Enjoy!<br /><br />====================<br /><br />On Error Resume Next<br /><br />Set objSysInfo = CreateObject("ADSystemInfo")<br /><br />strUser = objSysInfo.UserName<br />Set objUser = GetObject("LDAP://" & strUser)<br /><br />strName = objUser.FullName<br />strTitle = objUser.Title<br />strDepartment = objUser.Department<br />strCompany = objUser.Company<br />strOffice = objUser.physicalDeliveryOfficeName<br />strPhone = objUser.telephoneNumber<br />strFax = objUser.faxNumber<br />strMob = objUser.Mobile<br />strAddrs1 = "Site 1 Address"<br />strAddrs2 = "Site 2 Address"<br />strAddrs3 = "Site 3 Address"<br />strWeb = "www.domain.net"<br />Logo = "\\server\share\logo.jpg"<br />ShowLogo = "\\server\share\show1.jpg"<br /><br />Set objWord = CreateObject("Word.Application")<br /><br />Set objDoc = objWord.Documents.Add()<br />Set objSelection = objWord.Selection<br /><br />Set objEmailOptions = objWord.EmailOptions<br />Set objSignatureObject = objEmailOptions.EmailSignature<br /><br />Set objSignatureEntries = objSignatureObject.EmailSignatureEntries<br /><br />objSelection.Font.Name = "Arial" <br />objSelection.Font.Size = "10" <br /><br />objSelection.InlineShapes.AddPicture(Logo)<br />objSelection.TypeParagraph()<br />objSelection.TypeParagraph()<br /><br />objSelection.TypeText strName & ", " & strTitle & Chr(10)<br />objSelection.TypeText strDepartment & ", " & strCompany & ", " & strOffice & Chr(10)<br />if strOffice = "Site1" then<br /> objSelection.TypeText strAddrs1<br />end if<br />if 
strOffice = "Site2" then<br /> objSelection.TypeText strAddrs2<br />end if<br />if strOffice = "Site3" then<br /> objSelection.TypeText strAddrs3<br />end if<br />objSelection.TypeText Chr(10)<br />objSelection.TypeText "Tel:" & " " & strPhone & Chr(10)<br />objSelection.TypeText "Fax:" & " " & strFax & Chr(10)<br /><br />if strMob <> "" then<br /> objSelection.TypeText "Mob:" & " " & strMob<br />end if<br /><br />objSelection.TypeParagraph()<br />objSelection.TypeText strWeb & Chr(13)<br />objSelection.TypeParagraph()<br />objSelection.TypeParagraph()<br /><br />if strOffice = "Site1" then<br /> <br />end if<br />if strOffice = "Site2" then<br /> objSelection.InlineShapes.AddPicture(ShowLogo)<br />end if<br />if strOffice = "Site3" then<br /> <br />end if<br /><br /><br />Set objSelection = objDoc.Range()<br /><br />objSignatureEntries.Add "AD Signature", objSelection<br />objSignatureObject.NewMessageSignature = "AD Signature"<br /><br />objDoc.Saved = True<br />objWord.QuitTony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com2tag:blogger.com,1999:blog-8966928343955870830.post-55472554326502856102011-01-04T07:57:00.000-08:002011-01-05T02:10:43.799-08:00A third slice of VHappy new year to one and all!<br /><br />This post follows on from a previous item on virtualisation. We had installed the hardware, then the ESXi software - now to start getting serious.<br /><br />ESXi does have a console to set up certain key items, but these are very limited. Essentially, it allows you to change the hostname, set IP addressing and some security, not much else. To manage the host machines, you have to use another piece of software; the vSphere Client, which runs on a PC. I already had a copy of this installed on my laptop, from the tests that I had run earlier in the year.
However, I decided to get the latest version so that we could start as we mean to continue.<br /><br />The update went through quite quickly and after about 15 minutes, I had the logon dialog box. I put in the correct IP address and logged on to the host; except that it came up with an "invalid user name or password" message. I checked the details and they were correct. I double checked the details; domain, username, password. They were definitely all correct. After having stared at this for a few minutes, I then realised that the host installation had used a US keyboard layout and I was inputting the details using a UK layout keyboard. When I re-entered the same details using the US layout, it let me access the host. And it appears that there is no UK layout option available in the host installation routine.<br /><br />Looking at the details of the host, I could create VMs and allocate resources; but this wouldn't allow me to manage the other hosts. To do this, I had to install the vCenter Server product and use that to do all the management. The idea was that this would be installed on the first VM, but when I tried to install the software, it produced an error stating that it was not possible to install the server software on a VM. This made no sense; the material that I had received all indicated that the best practice would be to install the vCenter Server on a VM.<br /><br />After some analysis, the solution became obvious; I had the wrong version of vCenter Server. I had downloaded it from the VMware web site; once you get used to the site, it is quite sensibly laid out, but to start with, it can be a bit overwhelming. When I checked, there is a particular set of downloads to match the version of VMware that we had purchased, and this was where I should have got the software from.
So I downloaded that version; and yes, it installed straightaway.<br /><br />So far, so good; I had the hosts running, the SAN was available and with the vCenter Server software installed, I could see all of the hosts and start to do some more detailed work. Unfortunately, we have a number of projects on the go at the moment, so I was involved in another one for a few days before I could get back to playing with the VMs.<br /><br />When I did get back to the virtual platform, I wanted to make the storage on the SAN unit available. I was able to enable the iSCSI initiators and these showed the disk allocation on the SAN unit. However, these were not available to the VMs; it needs a check box to be ticked for this to happen.<br /><br />Later on, I realised that we still had an issue; although the storage area was available to VMs on the one host, it wasn't available to the others. Further checking revealed yet another setting (this time on the SAN itself) that needed to be checked, and as soon as this was done, each of the hosts could see all of the storage areas.<br /><br />Unfortunately, I got this resolved after I had created the first VM and installed the vCenter Server. This means that the image and the virtual disk are actually stored on a local drive on the host server, which is not quite what was planned. It appears that this can't be moved using the vMotion process; but I may be able to get around this by using the P2V function at a later stage. If this works, I'll write another piece about that later.<br /><br />So at this point we had all of the hardware installed, all of the software licensed and running, our first VM created and some templates ready for future use. We can now manage the systems and have experimented with copying, snapshotting, moving using the vMotion process, modifying resource allocation on VMs and deleting the various unwanted bits.
It has taken a bit of time, but now there is a good level of confidence in the product and we are comfortable that we can move to the next level. And there will be more on that next time.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-85099296230065962722010-12-15T23:46:00.000-08:002010-12-16T05:32:18.296-08:00BCS - Retro computingMost IT staff work in fairly small groups; even in the larger companies, teams break down into groups of just a few people. As a result, it's easy for people to develop a "silo" mentality, and forget that there is a larger world out there.<br /><br />For that reason, I like to try to get to various events where there is an opportunity to speak to others within the profession. It's really useful to be able to share ideas, talk about common problems, and to know that there are other people that have exactly the same pressures on them and all too often, the same feeling that their work is not appreciated.<br /><br />The BCS in the South West organise a number of events throughout the year, although there tend to be more during the Winter and Spring terms. During the Summer months, most of the organisers are busy with exam systems, as they tend to work in academia.<br /><br />The latest event at the University of Plymouth was a talk on "Retro computing"; a look back at some of the hardware and software systems of the last half century. It was quite amazing to recall the changes that have occurred over that time, and to see once again the boxes that seemed so modern and powerful at the time.<br /><br />They had a good amount of older equipment on display, items that have been picked up over the years and kept to be part of a "museum of computing". People had the opportunity to use a few of these old devices; it was quite interesting to be able to once again play a game of Lemmings on the old Amiga.
<br /><br />However, it wasn't just about games; they had some emulation software there that showed how some of the older systems used to run and what kind of business systems were running on them. As someone who had once had the opportunity to create a program from scratch, by designing the flow chart then creating the commands on a series of large punch cards to be processed on the mainframe at County Hall, I had a strange sense of nostalgia.<br /><br />For some of those there, most of the hardware was beyond their recall; several students were actually younger than some of the exhibits, which is quite a scary thought! It just makes me wonder if my nice new shiny HP laptop will seem as ancient and irrelevant in another 20 years.<br /><br />The BCS South West are also starting a new web site to act as a repository for some information on older computing. The site is there but nothing is available just yet (http://retrocomputing.org). I'm told that they intend to slowly build this up with the help of a few volunteers in the months to come.<br /><br />In all, it was a really interesting evening, with a lot to see and do. It was also amusing to see who were the highest scorers in "Crazy Taxi"! Clearly there were a lot of people with grey in their hair who had spent just as much time playing games as some of the younger generation.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-27732178946308701922010-12-11T08:46:00.000-08:002010-12-13T03:24:09.313-08:00Sec-1 Penetration WorkshopOn Friday 10th I went to a workshop event held in Bristol. It was organised by Sec-1, a specialist security firm http://www.sec-1.com/ - note the correct address; if you get it wrong, you end up at a completely different type of business!<br /><br />Obviously, these events are to promote the company and their services; however, it wasn't just a massive sales pitch.
The main purpose was to offer people advice about maintaining good security practice by illustrating just how easy it is to break into systems and highlighting the reasons why.<br /><br />The speaker was Gary O'Leary-Steele and he spoke with passion, conviction and a great deal of knowledge. He indicated that they have carried out many penetration tests over the years, and in most cases they could use the same report over and over again, just changing the name of the organisation. This is particularly the case in the 150 NHS trusts they have investigated, but is also often true of many private sector businesses.<br /><br />He stated that in many cases, people have failed to install patches which were issued for specific problems, often long after the issue was identified. As it happens, I did a quick search on MS06-040 & MS08-067, the two main culprits, and the autocomplete worked in each case after just the first 4 characters - the problem is that well known.<br /><br />He went on to discuss some of the most common problems and illustrated how they could be used to access systems. He also went on to demonstrate how easy it can be to identify vulnerable systems, get access to accounts with inappropriate levels of security permission, crack passwords and elevate permissions. In most cases, the team of testers expect to get access within 30 mins - if they take longer than an hour, the others tease them unmercifully!<br /><br />Most of the tools that they use are available quite freely on the Internet. In some cases, they do use items that have been commercially written and there is a small charge, but generally those ones are for the real high end stuff.
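The password cracking demonstrated at events like this is, at its heart, nothing more exotic than hashing a wordlist and comparing the results against captured hashes, which is why passwords drawn from common lists fall so quickly. A minimal Python illustration (the wordlist and the use of unsalted MD5 are hypothetical simplifications, not what any particular tool does):

```python
import hashlib

def crack(stolen_hash, wordlist):
    """Return the first candidate word whose MD5 digest matches the stolen hash."""
    for word in wordlist:
        if hashlib.md5(word.encode()).hexdigest() == stolen_hash:
            return word
    return None  # not in the wordlist - the attacker needs a bigger list

# Hypothetical mini wordlist; real ones run to millions of entries.
words = ["letmein", "password", "qwerty", "monkey"]
stolen = hashlib.md5(b"password").hexdigest()

print(crack(stolen, words))  # -> password
```

A password that never appears in any wordlist forces the attacker into brute force; anything that does appear is found almost instantly, however clever the user thinks it is.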
Each has their own favourites, in much the same way that people do with most other kinds of software.<br /><br />Whilst going through the potential problems, Gary also indicated some of the possible solutions, often by using the software tools to confirm the problem, then implementing suitable practice or policy to ensure that something is done to minimise the problem or reduce the impact.<br /><br />It should also be noted that many of the exploits demonstrated were in Microsoft operating systems or software; but the speaker also very carefully highlighted that issues are just as prevalent in other software products. Mac, Linux, Adobe etc. were all shown to be just as insecure. In many cases, this was due to installation or configuration, but equally there were many flaws straight out of the box.<br /><br />I'm not a security specialist, although I have had some training in this area. I also enjoy some of the work involved, although it has to be said I don't think that I have the necessary skills to make this my specialism. However, I think that I know enough to be able to state that there are a lot of people that suffer from "delusions of adequacy"; they think that because they use a particular product, or do a specific thing, that makes them invulnerable. Often, they are so wrong that it is difficult to know how to take them seriously in anything.<br /><br />I'm going to say that it was a great day, a really useful workshop and I was very impressed by the whole event.
If they organise any more (and I'm told they certainly hope to), I would very strongly suggest that you grab the opportunity to get along and take advantage of the information and advice that they are willing to hand out free of charge.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-59756314611915759832010-12-05T10:33:00.000-08:002010-12-05T11:34:05.253-08:00V TwoFollowing on from last week's blog.<br /><br />So we bought the hardware, and after it had been delivered, installed everything in the rack, and sat back to start planning the installation. I started up one of the host machines to get a look at the POST and boot processes. To my surprise, an operating system had already been installed - and it was Windows Server 2008 R2 Datacenter. We had purchased the licences for this, but hadn't expected that they would pre-install it.<br /><br />Well, no problem; just have to install the VMware ESXi. I had a version of the ESXi software, but it was an older version, so first I had to download an updated version of the software, which came as an .iso image, then create an install disk. Having created the disk, I was then able to do the install. I was really quite surprised; it went through very quickly. Very little to see, just a few Linux-style screens showing the progress of the install. But after just under 15 minutes, it was all done. <br /><br />So obviously, it also made sense to do the other two hosts at the same time. Away I went and the second machine was done in much the same time, everything complete with no issues. I then started the third machine, and decided to go for a quick cup of tea as there seemed to be no point in me hanging around watching a series of dots advancing across the screen.<br /><br />But when I got back, I had a bit of a shock; the process had stalled part way through.
The equipment didn't seem to respond to any keystrokes, so I took the disk out to check if there was a fault, but it didn't seem so. I tried to start the install again, and unfortunately, once again it stalled. A third attempt fared no better, so I decided to take a break and look at the vSphere Client install whilst I thought about what could be the issue.<br /><br />I had already installed a copy of the latest version of the vSphere Client on my laptop for our test a short while ago, and just had to change the logon details. It connected to the host machines without any issues and I could play around with the various bits. I even did a quick install of a guest operating system to create my first virtual machine. Everything looked really good.<br /><br />However, I then noticed that there seemed to be something odd about the disk allocation on the datastore on the server. There were several partitions, none of which I had created. Worse, it seemed that several of these were unusable by either VMware or the guest OS. Having given it some thought, it seemed to me that when the ESXi software was installed, it didn't re-partition the disk in the way that might be expected, and part of the disk would never be available for use, which might be an issue.<br /><br />At that point, it seemed appropriate that I should go back over the ESXi software install. I did this, checking the process, and at no point did it actually indicate that there was an option to manage the partitions. In the end, I simply put the Windows disk in, used the install routine to start up, and deleted all existing partitions. After that, I ran through the ESXi install, and this time, it made all of the disk available for use. I then decided that I would do the same on the others, and the second machine completed without any issues.<br /><br />The third machine also allowed me to delete the partitions OK and there seemed to be no reason why the ESXi software shouldn't install.
But still it would only go so far, then it stalled every time. I went through this a couple of times, before going back to my desk to give it some more thought. And at that point, I discovered the reason why, and it was so frustratingly simple, I am almost embarrassed to tell you what it was.<br /><br />We use a very clearly structured IP address range within our network; servers get a static address in one subnet, and all addresses assigned via DHCP are in a slightly different subnet. The address that I had input as part of the install routine was an address within the server range and one that had been specifically reserved for the virtualised platform. <br /><br />But somehow, the address allocated for the third machine had also been given to a secondary network card on an old server. Someone had added a cable to the NIC and then plugged it into a network point. The install routine had failed because it detected that the address I tried to give it was already in use! Once I sorted out the superfluous NIC, the install routine went through without any more issues.<br /><br />At this point I had 3 host machines, all installed and a connection to each tested with the vSphere Client software. A good start and I felt that I was starting to understand VMware. I still had a few other things to go over, but I was feeling really quite positive about the various processes and was looking forward to getting on with it.<br /><br />But the next step will have to wait for another day 8-)Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-53593745012375611342010-11-26T11:25:00.000-08:002010-11-26T12:54:37.026-08:00V for VirtualFor some time now, we have been looking at a project to implement virtualisation.
I decided that this would make for some interesting blog entries, and I thought that I might focus on this for a while.<br /><br />First of all, I suppose that I should go right back to the beginning to explain some of the reasons behind the decision. When I first joined the company, the servers were mostly tower models that stood on a table in a small room. These devices had limited processing power, low memory and disk storage even by the standards of the day, and were not really up to the task required of them. It should be said that they were most definitely not cheap, but certainly could not be described as being good value for money.<br /><br />It was identified that we needed to buy some newer machines to replace this old equipment, as a matter of some priority, in order to provide urgently needed resources. As part of the project, it was agreed that we would move to rack mounted equipment; this made far better use of the available floor space, as we could get a lot more in the same area. The equipment was not totally top of the range, but was very good quality, a good specification and thanks to some quite keen negotiation (though I say it myself) was pretty good value for the money.<br /><br />This made a huge difference to operations. Within a short time, staff could see significant improvements in speed of operation, we had much better storage facilities, and it was all much more flexible. This all helped demonstrate that the investment was appropriate; and I was also able to confirm some of the benefits using some standard metrics.<br /><br />But that was some 5 years ago. That same equipment is still functioning, and thanks to some upgrades is still providing a good level of service. However, it has been identified that across the estate, much of the processing power is underutilised. Although some machines made full use of their memory, more than half did not.
We have a couple of servers with disks getting quite full, but the rest are using less than a quarter of the available space. The most obvious excess is in the network cards; generally, they are using less than 5% of the available capacity.<br /><br />It was also identified that these servers were manufactured before the newer energy-saving features now available; they use quite a lot of electrical power, both to operate and to cool. We ran some tests and found that they would operate just as well at a warmer temperature than had previously been used, and this helped to reduce the need for cooling, so it did save some electricity, but we felt that it should be possible to do better.<br /><br />Of course, it was also identified that with equipment getting on for 5 years old, there was an increasing chance that we would see some hardware failure. This was the main concern for me; it seems foolish to be miserly with spending on hardware, when a failure could cause huge losses to the business due to loss of data or operational capacity.<br /><br />After identifying the need for replacement equipment, we started to look at newer versions of the same hardware; this had a number of green options for power saving, but I was still concerned that we would be paying for extra capacity that never got used, even allowing for growth within the business.<br /><br />Like a lot of people, I'd heard about virtualisation, but wasn't sure if it would really work for us. I was offered the chance to see some Dell kit in action, along with the EqualLogic SAN units. These were really impressive, and gave a lot of options. I also compared these to some HP hardware with StorageWorks; these looked a little better, if also a bit more expensive.<br /><br />The next step was to consider what virtualisation software to use. I had some spare hardware and installed evaluation copies of both Hyper-V and VMware. I also took the opportunity to see some Citrix systems in action.
It wasn't really possible to do as full a test as I would have liked due to pressure of work, but it soon became clear that the decision would come down to a choice between Hyper-V and VMware. I liked both and felt that either could do a really good job; it was just a case of which we felt we would be happier with in the long run.<br /><br />At this stage, I managed to get some basic technical books for the two software products; I had hoped that this would help to make the decision a bit easier, but unfortunately, it didn't really help at all. In the end, I decided that we would go with VMware; the product looked a bit more polished, it's been around longer and is more mature. <br /><br />So at that point, I started to do some negotiation with the suppliers. This went on for a while, and yes, I played them off against each other. But ultimately, I managed to get a deal that I thought was worthwhile, that the supplier was happy with and that I could sell to the senior managers. There was a slight delay getting the stuff onsite, but it's all here now, and we are starting to install it; but that's going to be the topic for another occasion.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-60624109101964050362010-11-18T05:48:00.000-08:002010-11-18T08:33:49.460-08:00Bookworm part 2Just a (fairly) brief addendum to my previous post about the Amazon Kindle. I've taken a week's holiday (I had a very nice time, thank you) and I made really good use of the Kindle whilst I was away.<br /><br />I'd ordered and downloaded a number of books beforehand; a bit of a mixture, some thrillers, some technical stuff, some historical and some classics. I should note that all of these were free! <br /><br />I'm not a sunbathing freak; I will do a bit of lying around, but generally get pretty bored after a while. I mostly used the Kindle in the evenings, after supper and just before going off to bed.
However, there were a couple of occasions when I sat out on the balcony to catch some rays and used the Kindle to occupy my mind.<br /><br />The screen is really easy to read even in bright sunlight (and it was bright) and the text is really clear. Changing pages is really simple; the buttons are on each side and have a nice solid feel to them. Changing books is not too difficult; but I do feel that the square button with its ring for the selection and entry functions is a bit less solid.<br /><br />If I had tried to take the same books with me in paper format, I would have required a much larger suitcase; stood on top of one another, they would have been at least 35-40 centimetres in height (14-15 inches in old money). <br /><br />There is no doubt in my mind, the Kindle is a great little toy. If I didn't have one, I would say that it would be top of my wish list. I would say, though, that I would advise getting a proper cover for it; I got a rather nice black leather one, but there are others in different colours and patterns. But each to his (or her) own.<br /><br />I haven't seen the Sony e-reader, so can't compare it; but I have shown my device to some others who seem to think that they prefer the Kindle. (But that's just their opinion.) <br /><br />At some stage, I think that I will subscribe to a magazine as well, and I'll do a write-up to confirm how I get on.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-40802025567083203232010-11-02T05:08:00.000-07:002010-11-02T05:38:31.760-07:00Springboard Tour 2010It's been a pretty busy weekend. I went up to Wembley to watch the NFL and stayed overnight so that I could get to Reading early on Monday morning to visit the Microsoft campus for the UK leg of the Technet Springboard Tour. This event was the only one in this country; the others are in major cities across Europe.
<br /><br />http://springboardseriestour.com/<br /><br />The Springboard tour is about promoting the latest technology and providing opportunities for people to see the products in use. They also covered some of the reasons for migrating to the latest versions and highlighted tools and resources that can be used to make the process a lot easier.<br /><br />I really like visiting the Microsoft Campus; there is always an energy and a buzz about the place that just makes you feel that it is great to work in technology. I believe that all too often, those of us at the sharp end get very isolated and develop a silo mentality about the work we do. It's important to take the chance to get out to see other people and understand that we are all part of a much larger community, that there are others that have exactly the same kind of problems and that there is more than one way of tackling the issues that we face.<br /><br />The presentations were introduced by Stephen Rose - and I have a link to a video that he made a while ago. He says that he had drunk about 2 gallons of coffee before the filming and I can believe it!<br /><br />http://www.youtube.com/watch?v=H2ewOGNGmZY<br /><br />During the presentations, they made really good use of the demos to show just how you might improve the rollout and migration process. The tools provided are all available through the Technet site and many are improved versions of things that are already in use. There was someone with a video camera filming the event, so some of these may be added to the main site (link above) in addition to the pre-prepared videos.<br /><br />Unfortunately, the sessions slightly overran - and there were a number of people who had to leave early, missing the final demo. This was of the Diagnostic and Recovery Toolset (DART). I'd very briefly heard of this before, but hadn't really had the chance to work with it.
It looks like a really valuable asset for anyone providing any level of support to end users, and in particular anyone providing support for fatal errors. We will definitely be downloading it to give it a try in the next few weeks.<br /><br />There was a bonus for those that attended; a free copy of Office 2010! There were also a few other little giveaways and prizes just to say thanks for being there. If you missed it, then you would have to go to one of the events on the continent, as there won't be another one in the UK. However, the presentations and information on the resources are on the Springboard site and I would recommend that you take the time to check it out.<br /><br />As you may gather, I found the whole day a very good use of my time and really enjoyed the chance to talk to the various people. I am sure that I will be making really good use of the information that I picked up there in my daily work over the next few weeks.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-55341137029468663372010-10-28T08:41:00.000-07:002010-10-28T09:43:43.813-07:00BookwormI've always been a bookworm. As a child, I was one of those that used to take a torch to bed so I could read under the sheets. I used to go to the library and draw out a couple of books and read through them in a matter of hours.<br /><br />Even now, I have a large personal store of books. At the last count, well over 700; a mixture of hardback and paperback. About 150 of these are technical reference books for various things or books for my studies.<br /><br />When the concept of the ebook reader was first publicised, I was quite keen to see one. I thought that the concept was good and could see real value in it; but I wasn't quite so sure about the price. I've been hoping that some kind person would buy me one for a present (yeah right!) or that I might win one in some prize draw.
But sadly, no such luck.<br /><br />Anyway, a couple of weeks ago I decided that it was time for me to get one for myself. I had a number of Amazon vouchers from various sources, and I decided that I could trade these in as part payment on a Kindle. I bought one and a small leather wallet to keep it in. I also downloaded the software and got a number of free ebooks from the Amazon site.<br /><br />The Kindle turned up just over a week ago, and I've been playing with it ever since. It is so good! The text is really easy to read even in strong light; I don't need to change the font size, although that is an option. I had a couple of issues getting it synced over the wireless, but that was down to me typing the encryption key in wrong. Once I got it right, the device connected and updated everything straight away.<br /><br />I've already gone through a number of books, and really enjoyed using the device. I don't think that I'm going to have a problem as it is supposed to hold about 3,500 titles. At the moment, I've got some two dozen books stored; that should be enough for me to take on holiday in a couple of weeks. <br /><br />The alphanumeric buttons are a bit on the small side, but as I don't use them that much, I don't see that as an issue. There are a couple of big buttons on the side to change pages and they are quite firm to use. The only real criticism is the silly button with the tiny square around it for the selection / entry; I'm sure that they could have designed something a bit more solid.<br /><br />The Kindle also gives the option to have newspapers and magazines on the device; as you have to pay for those, I'm not so keen on the idea. But there is a particular magazine which I might sign up for, just to try it out. At 99p per month, I think that I can afford it. It's also supposed to allow you to read certain other types of files, but I haven't tried that yet.<br /><br />As you can tell, I think that this is a great little device.
I'm really pleased that I bought it, and I think it's well worth the money.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-2505231806010972010-10-04T03:51:00.000-07:002010-10-04T04:06:01.848-07:00SharePoint Saturday 2010 UKA couple of months ago, I first heard about the SharePoint Saturday UK event – not sure if it was through a tweet or an email. There have been a number of similar events around the world before, but this was the first in the UK. <br /><br />http://www.sharepointsaturday.org/uk/default.aspx<br /><br />I’m always interested in these types of events as they offer you the chance to learn new things, brush up on existing skills, and reinforce knowledge. It also offers the chance to network with other people in the industry, which I consider is always a useful exercise. On top of that, you often get the opportunity to speak with people that have highly specific knowledge of their topic.<br /><br />SharePoint is a product that I have experimented with, but purely for evaluation purposes. I believe that collaboration between staff is going to become a major initiative, and SharePoint is a tool that can really help bring people together and allow them to work more effectively. I hoped that the event would enable me to learn more about the latest iteration of the product and understand more about what it can do and what limitations it has.<br /><br />The event was held at the Birmingham Hilton Metropole hotel at the NEC. This is a very nice location, quite central for most people (although a bit of a journey for me). The hotel had a lot of suitable resources and I think that it was a great location for the event. I should also add that the event was free to attend!<br /><br />There was a really good mix of topics – some were quite technical, some were a bit more of a high-level overview, so there was plenty for most people to get involved in.
A couple even included demos, which were really helpful. I particularly enjoyed the PowerShell administration demo by Penny Coventry; as I have recently been doing some work in this area, I was able to relate it to the stuff that I had been looking at, and had the chance to clarify a couple of small issues.<br /><br />What was quite amazing was that the individuals organising and speaking at the event were doing so in their own time, and travelling to the event at their own expense. When you consider that a couple of them had travelled from the States, South Africa and further afield, this shows a particular level of dedication to the concept of passing on knowledge. Many other people have expressed their gratitude, and I have to add my thanks as well; they certainly deserve high praise.<br /><br />I also have to say that the buffet lunch provided was really excellent. I have to get the recipe for the blue cheese, mascarpone and red onion quiche tartlets - they were really delicious and I must admit that I ate more than a few of them! Not good for the waistline, but for a one-day event, very enjoyable indeed. My compliments to the chef!<br /><br />Another big thank you has to go to the event sponsors; apart from paying for the whole day, they provided a large number of valuable prizes which were awarded at the end of the day. Among these were a Kindle, an iPad, an Xbox, about 70-80 books and t-shirts, as well as some really valuable licences and training offers. There was almost enough on offer for most people to walk away with at least one bit of swag.<br /><br />The day finished with SharePoint Saturday 2010 UK turning into SharePint; the chance for everyone to head for the bar. I carried out a completely unscientific study amongst a number of those present, and it was clear that everyone had had a great day; learned a lot, had the opportunity to see some really valuable demos and network with other like-minded people.
<br /><br />If you missed the event and want the chance to see another, I would bookmark their web page and watch out for next year. I get the feeling that they hope this can become an annual event. Certainly I wish them well; the work that was put in to organising it deserves the recognition, and I think that it could become a very valuable resource for anyone interested in learning more about an underrated piece of software.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-56116008454371187622010-09-29T08:51:00.000-07:002010-09-29T08:53:53.040-07:00The birth of a Third PlatformThe BCS South West region hosts a number of events; I like to go along to these as they usually include some very interesting topics, but it’s also quite useful to network with other IT pros from different backgrounds.<br /><br />At a recent event, there was a guest speaker from Apple; Lawrence Stephenson, talking about “The Birth of a Third Platform”. He was discussing the rise in the use of iPhones and iPads, particularly by students at schools or in universities and colleges, and proposed that this is a new form of computing. Although primarily about higher education, much of what he discussed was also relevant to business.<br /><br />The basic argument was that mainframe systems were the first generation of computing, and the standard client / server technology that we have become used to is the second generation. The third generation is therefore the use of mobile computing devices as access points to process or make use of data; hence the “third platform”. <br /><br />He illustrated his talk with some interesting facts about the growth in the numbers of smartphones and tablet devices, particularly among students. He also compared how these are used; to access email, social networking sites, general web browsing etc.
He also identified that some were using their devices to access relevant items related to their courses, but this was still a relatively small number and there was potential for growth in this area. <br /><br />He demonstrated this by showing some apps that had been developed for a university in the States; and these were clearly items that a student would find tremendously helpful, particularly those new to university life, such as campus maps etc. All in all, a really good demonstration of just what can be done.<br /><br />There was one very interesting comment though; he showed some statistics that could be used to suggest that most people actually use their device more for accessing data than they do for making phone calls. As such, there could be an argument for saying that it is quite possible that some future device might not actually have a phone capability as such; you would be more likely to contact people using IM, or calls would be routed through an IP-based utility such as Skype.<br /><br />Of course, these types of devices are not new; tablets have been around for some 10 years. However, the advent of the smartphone has encouraged the development of small apps that allow people to do specific tasks really quickly and easily, and that has made a huge difference in the take-up of mobile computing. As people have found new uses, it encourages more people to make use of them, and more developers to consider writing apps for specific requirements.<br /><br />Most companies have “road maps” that give a structure to their research and development process and show the customer what they are working on for future products. Apple are a bit tight-lipped about their vision for the future, so it is difficult to be certain about what they have in the pipeline. However, I would suggest that they (and many others) are working on the basis that there will be more people wanting to make use of mobile devices.
<br /><br />Who knows; maybe in the not too distant future, we won’t be using PCs any more, but will just do all of our work using a mobile device.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-36377653535780041162010-09-13T01:28:00.001-07:002010-09-13T01:29:14.340-07:00Watch the pennies..and the pounds look after themselves. So the old saying goes.<br /><br />Yes it’s that time of year again; time to think about next year’s budget. Our company financial year runs from 1st Jan to 31st Dec. The FD needs to check it over and approve it, and he needs some time to cook the books (sorry, prepare the COA), so we need to get budget plans drawn up a few months before December.<br /><br />I tend to start by writing a list of the specific jobs that we intend to do: plans to replace major hardware such as UPS units or servers, and major software upgrades like the move to Windows XP a few years ago. It can also include work that we think we will be required to do; currently we are waiting for the go-ahead on new offices and they will have to be cabled up. I try to get a quote so that we have a fair estimate of the cost.<br /><br />We have a lot of specialist software for CAD drawings etc. and these have quite expensive support costs. Added to that are the support costs for CRM, ERP and so on. In some cases, I think that it would be useful if these were in a separate budget, but they are not, so I just have to get on and deal with it. I also add in an amount for other software upgrades.<br /><br />The next step is to think about smaller hardware purchases; monitors, disks, cables, replacement printers etc. I also consider consumables; toner cartridges, disks and batteries. I try to work out what we have bought / used in these areas, then use that as a benchmark for the next year.<br /><br />We also need to plan for Business Continuity / Disaster Recovery.
This requires that we keep some spare equipment, pay towards a BC / DR partner and take appropriate actions to make sure that we can be flexible enough (and secure enough) to put things into place at short notice to allow the company to deal with sudden problems.<br /><br />Once all of these items have been assessed, I put them into a spreadsheet. I tend to leave details on the form so that the FD can verify it; the more detail the better, as it saves him pestering me. It also means that he sees part of the justification for the spending, which gives him confidence that I have thought things through.<br /><br />In our case, I also try to work out roughly what we are likely to spend on travelling to work at our other sites; hotels, mileage allowance, flights as appropriate. In addition, I also add an allowance for some training costs, as we have had to learn a lot of new skills around our ERP system and the training is absolutely essential. As it happens, the FD generally accepts my figures (although he does occasionally make some changes to match his numbers). <br /><br />After all that however, the big thing is to try to stick to the budget. Sometimes this is easier said than done. Generally, the smaller amounts are easier to offset within a budget. For example, someone managed to destroy a laptop a few years ago (he ran over it; he forgot to put it in the boot of his car!) and we needed to replace it at short notice. We could just put that down as a replacement and not worry about it.<br /><br />However, it is also possible to get a requirement for much larger items – we had to buy an add-on disk unit for a server which had not been budgeted for – not the end of the world, but it meant we had to be a bit careful about some other spending.<br /><br />But all in all, it seems to work for us. The FD is happy, the MD is happy, the staff are generally supplied with what they need, when they need it.
We get to manage things ourselves, which is a lot better than having to justify every single item of expenditure. I’ve seen places where this happens, and I would not like to be working under those conditions.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-61140424681298017032010-09-03T06:09:00.001-07:002010-09-03T06:34:59.775-07:00New SkillzOccasionally, I think back to when I first started working with PCs in the late 80s. <br /><br />At that stage, there were relatively few companies that made use of these and it was very much a hobby, although one that I enjoyed. I managed to get hold of some second-hand equipment and by trial and error, worked out what everything was and how it worked.<br /><br />In the mid 90s, I had the chance to work with computers as a job; primarily in a customer support capacity, but I also looked after the company hardware, network and server (yes, we only had the one). In those days, it was considered normal that someone working in IT would have a broad range of skills and be able to turn their hand to whatever task was needed. <br /><br />But in the last 10 years, we have seen a major change in the way that things work. There has been a considerable need for people to become more focussed in a specific area, whether that be database administration, programming, networking, telecoms etc. In the very big companies, they even have teams of people within these disciplines.<br /><br />For the smaller shops like ours, this makes life a bit harder. We only have a couple of staff, but we still need to provide the same level of support on the newer systems. There is still an expectation that each of the IT staff has all of the relevant knowledge to instantly know how a product works, what is causing a problem, and with a wave of the magic wand, can fix it. <br /><br />In the real world of course, it is completely different.
In most cases we have some good general knowledge of hardware and some good experience of using a couple of products. We’ve then developed particular skills in specific areas. For example, I have had to do a lot of work with SQL Server over the last couple of years, and although I wouldn’t describe myself as a DBA, I have a pretty good understanding of it. I have also had advanced networking and routing training, as well as some extra work in security. <br /><br />Among the staff, we have each developed key specific skills, and we can share the work out in a way that allows us to be most effective. As a small team, we work quite closely, so we still get the opportunity to broaden our skills base, probably far more than those in larger teams would be able to do. But we still have to learn those new skills, and there is no question that even within a team the size of ours, there is a definite division of labour based upon speciality.<br /><br />There are of course many companies that suggest we should outsource some of the work; and I can see a certain value in that. But I have not yet seen any outsourcing operation that will provide the level of support we currently provide at an acceptable price. It’s also likely that if we did outsource part of the work we do, all that would then happen is that the users / management would still insist that we try to fix things for them anyway, defeating the purpose of outsourcing.<br /><br />So for the moment, we just have to try to learn as much as we can, as quickly as we can (and probably as cheaply as we can).
I’m looking forward to the day when we can get the plug-in brain nodes that allow us to download information directly into our brains, without the pain of going through the learning process!Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-57393199060528850162010-08-26T02:03:00.000-07:002010-08-27T07:46:25.849-07:00Cool!Some years ago, we undertook a small experiment with our server room. We had heard that other people were reducing the amount of A/C cooling they used and we wanted to see if it was appropriate for us.<br /><br />Like a lot of other places, our small server room was kept cool to keep the servers cool; if we were to spend any length of time in there, we would need to put on a jumper or even a fleece to stay warm, as the room was around 10 degrees centigrade. The A/C units were running non-stop, and we wanted to see if we could reduce the electricity we used.<br /><br />Essentially, we made a load of measurements to get a baseline. These included the core temperature of the UPS, some measurements of the servers and various places within the server room. We were fortunate that our engineering manager had a device that we could borrow for this, as he was conducting a number of tests to help the company work towards ISO 14001.<br /><br />He also had a device that allowed us to measure the amount of power drawn by various devices – we seemed to get a couple of slightly odd readings, but when we discounted those, the average values appeared to match what would be expected. We therefore assumed that the errors we had were down to incorrect use.<br /><br />Having got our baseline values, we then started to increase the ambient temperature of the room, and examine what effect this had.
Each time, we would leave the changed settings for a couple of weeks to see what would happen; in each case, there was no sign of distress on the servers, so we were able to increase the temperature again.<br /><br />After some time, we found that the “sweet spot” was between 20 and 24 degrees centigrade. Above 24 degrees, we would see the fans in the servers starting to work much harder and draw more power. Below 20 degrees, the A/C was still running almost all the time. However, in that range, we found that we had the A/C unit running at its least power draw whilst the servers ran at a comfortable level.<br /><br />We found that in the racks, we had a few “hot spots”; places where the temperature was quite a bit higher than the ambient temperature of the room. We were told that this is normal and generally considered a good thing; these create a thermal current that allows the cooling to happen naturally. The interesting thing was that although the ambient temperature increased by 12 degrees, the hot spots only increased by 3-4 degrees.<br /><br />Part of the work meant that we had to make sure that the racks were properly positioned in the room to allow for adequate air flow, and the direction of air from the A/C also had to be optimised to prevent “air curtains” forming at various places. We also had to make sure that things such as blanking plates were used to ensure a properly controlled air current within the racks.<br /><br />Although this all sounds very grand, the room is quite small and most of the work was done in between our normal activities. We were able to make use of some additional advice from the A/C supplier, but that was relatively minor. The total amount of work required was actually quite small, but the results have been very good. 
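<br /><br />The comparison at the heart of the exercise is very simple: for each setpoint we trialled, add up the power drawn by the A/C and the servers, and pick the setpoint with the lowest total. A minimal sketch of that calculation is below; the figures are made up for illustration only (our real numbers came from the borrowed power meter), so treat them as placeholders rather than our actual readings.

```python
# Hypothetical figures, for illustration only: combined power draw
# (A/C plus server fans, in kW) recorded at each ambient setpoint
# trialled, in degrees centigrade.
readings = {
    10: 9.8, 14: 8.9, 18: 7.4, 20: 5.1,
    22: 4.9, 24: 5.0, 26: 6.7, 28: 8.2,
}

# The "sweet spot" is simply the setpoint with the lowest combined draw.
best = min(readings, key=readings.get)
print(f"Lowest combined draw at {best} degrees C ({readings[best]} kW)")
```

With these illustrative numbers, the sweep picks out the low-20s range, which matches the pattern we saw: below it the A/C runs constantly, above it the server fans spin up.<br /><br />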
We have seen a reduction in power consumption of just under 50% for the server room as a whole – which translates into significant cost savings.<br /><br />I’ve added a link to a resource that I would recommend to anyone wanting to do work on their server room facilities. It is primarily aimed at North America, but there are some bits that are specifically for the European market. It will take some time to go through all of it, but I consider that it would be time well spent.<br /><br />http://www.schneider-electric.com/sites/corporate/en/products-services/training/energy-university/energy-university.page?tsk=77518T&pc=26947T&keycode=77518T&promocode=26947T&promo_key=26947T<br /><br />The really good thing - we now have a server room that we can work in, in reasonable comfort all the time!Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-87210420261829508412010-08-24T08:09:00.000-07:002010-08-25T06:45:09.569-07:00I'm back!It's been a while since I posted anything; 6 months in fact. It's not a case of having nothing to write about, far from it. I've just been very busy, plus I've been a bit more active in other areas.<br /><br />One thing that I thought would be appropriate to point out is a Microsoft resource at: http://www.microsoft.com/uk/business/peopleready/technology/ioassessment/osyci/survey.mspx<br />This allows you to take a "survey" that can give you an indication of the status of your IT provision. I first came across this a while back and I found it very useful as part of the planning process. In order for you to reach a particular destination, it helps to know where you are starting from, so you can use the right directions.<br /><br />Essentially, Microsoft suggest that IT departments can be classified into one of 4 levels based upon standard practice. Five years ago, we would have definitely been classed as being at the lowest level, "reactive". 
The IT provision was based around fixing problems after they occurred and very little thought went into planning or preparation. <br /><br />We've slowly moved through the various stages, going from "standardised" to "rationalised", and are now pretty much at the top level, "strategic". There are still a few areas that we could improve upon, but that will always be the case. However, IT is now a solid platform that people can use. We don't get the network failures, system crashes, or data losses that used to occur. Resources are there and available 24 x 365 for people to use, and generally they can access them using whatever device is appropriate.<br /><br />Now although this all sounds great, there is unfortunately a fly in the ointment. The biggest problem is still the unit that is positioned between the chair and keyboard! It has been identified that we need to get people better trained, but somehow that never seems to get translated into action. One of the worst instances was a person who had been with the company for some 8 years. Unable to log on, she phoned the helpdesk to ask what her user name was! (She normally didn't have to type it in, as it just appeared in the login box.)<br /><br />I would encourage everyone to take a look at the Microsoft Core Infrastructure Optimisation resource. I think that you'll find it of significant value and help.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0tag:blogger.com,1999:blog-8966928343955870830.post-81366284984037018402010-02-12T00:45:00.000-08:002010-02-12T01:18:50.340-08:00BCS - Computer ForensicsFor some time, I've been working towards a postgraduate degree through the Open University. It's hard work, particularly after a long day when all you want to do is switch off and relax. However, I find the courses fascinating and of help to me in my daily work, so I keep working on it.
<br /><br />The last course I did was particularly interesting - Computer Forensics and Digital Examinations. This is a very technical subject, but it also requires an understanding of legal procedures. It isn't enough to say "I found so and so"; you have to demonstrate that the evidence is relevant, accurate and consistent, and present it in a way that non-technical people can understand. I found it all really interesting, if not totally linked to my daily job.<br /><br />So when the BCS South West indicated that they were holding an evening event and the topic was Computer Forensics, I jumped at the chance to attend. It was at the University of Plymouth, which is a really nice venue, if a little bit of a trek to get to from where I live. The speaker was a visiting professor, John Haggerty from Manchester, and the presentation was lively and informative. The actual notes should be available at this link. http://www.bcssouthwest.org.uk/server.asp?page=pastevents<br /><br />For me, the presentation covered most of the items that I had previously studied and it was really good to refresh my memory. It was also interesting to see that even in the short time since I did my course, a number of changes have occurred, and the discussion after the talk highlighted some of the issues facing practitioners in that field.<br /><br />One thing that is of interest - Digital Forensics is a field that is wide open for people to move into. However, there are a lot of people that think that just because they have a small amount of experience in running a computer, they know what to do to examine it. Professor Haggerty referred to this as the "CSI" effect - people see the TV shows where someone drives an expensive car, goes to a pristine work space and in half an hour recovers all the required information to solve the case (and the impossibly attractive woman is suitably impressed by the display of brain power!).
<br /><br />In reality, forensics is a long, tedious job. Everything has to be documented, step by step, and any assumptions made have to be justified. There are a number of practitioners that have had their reputations destroyed by a simple mistake, and once that happens, they are unlikely to be able to work in the field again. <br /><br />As the technology moves on, the process of examination gets harder - I can remember when I bought my first hard drive of 20 MB and wondered how I would ever fill it up. I now regularly work with physical hard drives of 500 GB and logical partitions of over 1 TB. To properly analyse and document such a drive can take a very long time, and new tools are being developed to try to make the analysis easier, but it still requires considerable patience.<br /><br />But all in all, a great evening - a fascinating topic, well presented. And for those IT people that think the BCS is only for academics, I would strongly suggest that you go along to one of the (free) events - I'm sure that you'll change your mind.Tony Shttp://www.blogger.com/profile/10593246665388578814noreply@blogger.com0