Tuesday, 31 May 2011
Last week, I had the opportunity to take part in three training events organised around the new Microsoft Office 365 product, the replacement for BPOS. The sessions were all online, run using Microsoft Live Meeting, with a mixture of PowerPoint slides and some live demos of the product in use.
The sessions started at 10.00 am Pacific Daylight Time (18.00 BST), as they were hosted from the West Coast of the USA. They ran until 4.00 pm PDT, which meant staying up until midnight; a long evening, particularly as I usually get up at 6.30 in the morning. However, the event was so worthwhile that I don't feel too put out by that.
(http://blogs.technet.com/b/uktechnet/archive/2011/05/11/register-now-for-the-office-365-jump-start-for-it-pros.aspx)
On the first day, they had a few technical issues with the audio at the very beginning of the session; for some reason, they kept losing the sound from the presenters. However, once that little hiccup was out of the way, the sessions picked up pace quite rapidly and they went through a great deal of information.
The moderator was Adam Carter, who kept things moving along really nicely; he was joined by a number of people who had specific knowledge of key components of the package, and they went into the various parts in some detail. At the same time, the online participants were invited to ask questions; some really great issues were raised, and for the most part the moderators were able to deal with these or pass them on to the specialists to elaborate further.
I’ll write up a bit more about the actual product itself in a later blog post; suffice to say that the various components were explained and demonstrated very well. I would suggest most people had a really good opportunity to see them in action, learn a bit more about some of the basic administrative tasks required, and how to make use of the new product.
Of particular note was the session on using PowerShell for some of the admin tasks; for those who are not so confident with this utility, or still working out whether they need it, the demonstration showed just how flexible and easy to use it is, and I'm sure that many will have gone away determined to learn more about working with the cmdlets.
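To give a flavour (this snippet is my own sketch rather than the presenters' demo, and assumes you have the Microsoft Online Services Module for Windows PowerShell installed), connecting to the service and listing users looks something like this:
====================
# Load the Office 365 cmdlets and connect (prompts for admin credentials)
Import-Module MSOnline
Connect-MsolService

# List each user and whether they have a licence assigned
Get-MsolUser | Select-Object DisplayName, UserPrincipalName, IsLicensed

# Show the licence plans available in the tenant
Get-MsolAccountSku
====================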
For me the best demonstration was by Mark Kashman, who gave a superb presentation on the use of SharePoint Online. He had created a demonstration site using the "Fabrikam" company name, and it was quite astonishing; simply one of the best SharePoint sites I've seen. A number of people asked if it would be made publicly accessible as a reference site, and he said they would look at it, but felt that the site was still unfinished and that the team would want to do more work on it before releasing it into the wild.
All in all, this was a really great opportunity to learn more about the new Office 365 product. It was very well put together and, I think, pitched at just the right level for most of the people involved. The slides are now available online to download –
http://borntolearn.mslearn.net/office365/m/officecrapril/default.aspx
They did suggest that the videos will also be available in a couple of weeks’ time, and if I get the details, I’ll add them on as well. They also promoted the Microsoft Virtual Academy, another really great free resource; if you haven’t heard about this, check it out at
http://www.microsoftvirtualacademy.com/Home.aspx
I hope that Microsoft puts out a few more sessions like this Jump Start; if they do, you would be well advised to sign up, as it is a great training resource for IT sysadmins and a good way to make sure that they stay on top of the latest products and developments.
Monday, 25 April 2011
DPM across Domains
I've been using the Microsoft System Center Data Protection Manager product for just under four years, and I really like it. As far as I am concerned, it ticks all the relevant boxes: easy to install, easy to use and manage, and most importantly, it works really well. It backs up to disk, then from disk to tape. It uses a relatively small amount of bandwidth, and data recovery is quick and easy. It is simply one of the best backup products I have come across, and far easier to use than many of the better-known packages.
A while ago, the company bought out a partner organisation. This left us with a sales office based in Paris; they are a separate entity, but as they are quite small, they don't have their own IT staff. They had been using the services of another business, but it was decided a while ago that we would take on that responsibility. We needed to provide a backup function to preserve their data, and set about putting this into place.
One of the key issues was that they did not have an Active Directory domain on site. Everything was set up as a workgroup only, and this causes a lot of issues, so one of the first things to do was to set up a suitable domain structure. Hopefully, this will reduce the amount of admin work required; previously, it was necessary to create a local user account on every single piece of kit, which took a lot of work. The new domain was created a couple of weeks ago, and we've now also created a two-way trust between the two AD domains.
The next step was to set up the remote site to be backed up by our DPM server, but this was where we hit a snag. Each time we tried to install the agent, it responded with messages that the remote site was not available. I could prove that this was false; I could ping the remote server and even RDP to it from the DPM server. I checked all sorts of things, and each showed that the remote site was fully operational and accessible.
So I decided to do a manual install of the agent on the remote site. The first step was to RDP to the remote server, then create a mapped drive back to the DPM server. Having done that, I opened the folder where the DPMAgentInstaller.exe file is found - that's under \Program Files\Microsoft DPM\DPM\Agents\RA\ in the i386 subfolder (there is also a folder for 64-bit AMD64 installs).
This actually went through OK. Having installed the agent, it's necessary to tell it which is the correct DPM server; this is done by running \Program Files\Microsoft Data Protection Manager\DPM\bin\SetDpmServer.exe with the -dpmServerName switch.
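For reference, the two manual steps look something like this when run on the remote server (a sketch rather than a transcript; the drive letter, server name and agent version folder are placeholders for your own values):
====================
:: Run the agent installer from the drive mapped back to the DPM server
:: (the version folder under RA\ depends on your DPM build)
Z:\Program Files\Microsoft DPM\DPM\Agents\RA\<version>\i386\DPMAgentInstaller.exe

:: Then point the newly installed agent at the DPM server
"C:\Program Files\Microsoft Data Protection Manager\DPM\bin\SetDpmServer.exe" -dpmServerName DPMSERVER01
====================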
Again this went through OK, but it still produced an error message that there were insufficient permissions to complete the process. Checking the event log, I could see a number of LsaSrv Event ID 6033 errors, which indicated that I should modify the registry to disable the block on anonymous logons. Having done this, it then showed another set of errors that indicated there was still a problem with permissions.
Having checked these yet again, I could see that the DPM server was in the correct groups etc., but I also added the DPM administrator account to the local Administrators group. Having done this, the error went away, but the agent still wouldn't connect to the DPM server. However, I ran the SetDpmServer.exe utility again, and this time it completed correctly. When I went back to the DPM console, it showed the agent as installed and connected to the remote server.
So now we are in the position where we can actually back up that remote site. It will be a bit of an issue to begin with, as there is a lot of data on site. I'll probably go over again to do a manual copy of the data to a portable hard drive. This can then be copied to the DPM server to provide the initial data load, and the synchronisation process will then only work on the data that has changed from that copy; a great deal less than the full synch process.
This is going to make a huge difference to the people on the remote site; they won't have to worry about tapes etc. or what to do if someone goes on holiday. The data is being backed up off site, so is more secure. The recovery process is really simple and we can give them the confidence that we can deal with it really quickly if needed.
Wednesday, 2 March 2011
Transformational Security
A couple of weeks ago, I attended an event hosted by Computer Weekly, SC Magazine and a couple of others: "Information Security Leaders 2011: Transformational Security". As you might gather from the title, it was a look at how and why things are changing, and how to provide security in the newer IT landscapes.
Although a lot of people think that these are just junkets, with a chance to pick up some swag and eat and drink at someone else's expense, I actually find these events very useful. Working within IT can have its problems; all too often, we work in small groups, and it's very easy to become isolated. This means that we develop set habits, and forget that there may be other ways of doing things.
Getting out to events like this can be really useful in many ways. It's interesting to talk to others in the industry and see just what kinds of problems they are facing. All too often, we might think that we are the only ones with a particular issue, only to find many other people with exactly the same problem. I really like to share advice and information on how we approach some of these and how and why we go down the route that we do.
This particular event was very useful. There were some keynote speakers who offered a real insight into just how things are changing and why, and they offered some considered advice on how to see this as an opportunity. In particular, the concept of "consumerisation" was raised - people wanting to use the equipment they already use for home email, social networks and so on for work as well. (That's not just the same make or model, but the actual device.)
At first, I thought that this was not an issue we would face, but then I realised that it has already happened. We have a number of staff with their own smartphones who are trying to connect them up so that they can get their email on the device. It's not been a major issue so far; but what would we do if one of those people left the company? (OK, cancelling their email account is a start, but what if they had access to someone else's account as well?)
Or how would you react if they lost their mobile device, and someone else found it and could use it to get access to company systems? The answers may seem simple, but as the speakers pointed out, this is the thin end of the wedge, and it's going to start happening a lot more often and involve a lot more devices and people.
All in all, the event was a good day (and yes, the food was good!); it was also very useful from the point of view of getting people to think slightly outside of their comfort zone. If there are any more events of this type, either this year or in the future, I would strongly recommend taking the opportunity to get along. You won't regret it!
Wednesday, 2 February 2011
Email signatures
Some time ago, it was suggested that we should have an agreed format for email signatures across the company. Unfortunately, it took some time to get agreement on what format to use. I could go into the details, but it's pretty boring; the discussions on the font to be used, for example, seemed to take forever. Suffice to say that there were numerous discussions, and it took quite a while to reach a final decision.
There are numerous sample VBScripts out on the Internet for producing an email signature, but none seemed to achieve what we wanted. I did think about trying PowerShell, but I don't yet know enough to do the work that way. As I've used VBScript on and off for a few years, it made sense to use that, at least for the time being.
The script has taken a little while to put together to make sure that it meets the needs of the business. It takes data from the Active Directory, formats it and places it in the required location. It also inserts a company logo, and there is a bit of conditional text to insert other logos; this is because we attend a number of trade shows, and like to promote these on our emails.
There is one slight issue; the email has to go out in Rich Text Format. If it goes as HTML, the lines get double-spaced - this is just the way it gets rendered, and I haven't found a way around it - and if it goes out as plain text, the logo doesn't get inserted. The script works by using Word: it extracts the AD data and builds the signature in Word before saving it into Outlook.
I'm putting the script below as I am quite pleased with it and the results; if it would be of any help, please feel free to make use of it. Just copy the text, paste it into a text file, save it, and change the extension to .vbs. I haven't tested it with every version of software, but I have tried it with Outlook 2003 / 2007 / 2010 on Exchange 2003 (on Server 2003), on PCs running Windows XP and Windows 7, and it worked in each case.
(Note that I have removed the specific details of our company so that it is a generic script; you would then have to modify it to show your own details.)
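(One suggestion of my own, rather than part of the original setup: if you want the script to run for everyone automatically, it can be called from a Group Policy logon script along these lines, with the share name changed to suit.)
====================
:: Example logon script line; //B runs in batch mode, suppressing prompts
wscript.exe //B \\yourserver\netlogon\signature.vbs
====================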
Enjoy!
====================
On Error Resume Next    ' carry on if an AD attribute is missing

' Get the distinguished name of the logged-on user and bind to their AD object
Set objSysInfo = CreateObject("ADSystemInfo")
strUser = objSysInfo.UserName
Set objUser = GetObject("LDAP://" & strUser)

' Pull the signature details from Active Directory
strName = objUser.FullName
strTitle = objUser.Title
strDepartment = objUser.Department
strCompany = objUser.Company
strOffice = objUser.physicalDeliveryOfficeName
strPhone = objUser.telephoneNumber
strFax = objUser.faxNumber
strMob = objUser.Mobile

' Static details - change these to suit your own organisation
strAddrs1 = "Site 1 Address"
strAddrs2 = "Site 2 Address"
strAddrs3 = "Site 3 Address"
strWeb = "www.domain.net"
Logo = "\\server\share\logo.jpg"
ShowLogo = "\\server\share\show1.jpg"

' Build the signature in a new Word document
Set objWord = CreateObject("Word.Application")
Set objDoc = objWord.Documents.Add()
Set objSelection = objWord.Selection
Set objEmailOptions = objWord.EmailOptions
Set objSignatureObject = objEmailOptions.EmailSignature
Set objSignatureEntries = objSignatureObject.EmailSignatureEntries

objSelection.Font.Name = "Arial"
objSelection.Font.Size = 10

' Company logo at the top, followed by a blank line
objSelection.InlineShapes.AddPicture(Logo)
objSelection.TypeParagraph()
objSelection.TypeParagraph()

' Name, job title, department and office
objSelection.TypeText strName & ", " & strTitle & Chr(10)
objSelection.TypeText strDepartment & ", " & strCompany & ", " & strOffice & Chr(10)

' Postal address to match the user's office
If strOffice = "Site1" Then
    objSelection.TypeText strAddrs1
End If
If strOffice = "Site2" Then
    objSelection.TypeText strAddrs2
End If
If strOffice = "Site3" Then
    objSelection.TypeText strAddrs3
End If
objSelection.TypeText Chr(10)

' Contact numbers; the mobile line only appears if one is set in AD
objSelection.TypeText "Tel: " & strPhone & Chr(10)
objSelection.TypeText "Fax: " & strFax & Chr(10)
If strMob <> "" Then
    objSelection.TypeText "Mob: " & strMob
End If
objSelection.TypeParagraph()

objSelection.TypeText strWeb & Chr(13)
objSelection.TypeParagraph()
objSelection.TypeParagraph()

' Conditional trade show logo - only the Site2 office promotes a show at present
If strOffice = "Site2" Then
    objSelection.InlineShapes.AddPicture(ShowLogo)
End If

' Save the whole document as the default new-message signature in Outlook
Set objSelection = objDoc.Range()
objSignatureEntries.Add "AD Signature", objSelection
objSignatureObject.NewMessageSignature = "AD Signature"

objDoc.Saved = True
objWord.Quit
====================
Tuesday, 4 January 2011
A third slice of V
Happy new year to one and all!
This post follows on from a previous item on virtualisation. We had installed the hardware, then the ESXi software - now to start getting serious.
ESXi does have a console to set up certain key items, but it is very limited; essentially, it allows you to change the hostname, set IP addressing and some security options, and not much else. To manage the host machines, you have to use another piece of software, the vSphere Client, which runs on a PC. I already had a copy installed on my laptop from the tests I had run earlier in the year, but I decided to get the latest version so that we could start as we mean to continue.
The update went through quite quickly, and after about 15 minutes I had the logon dialog box. I put in the correct IP address and logged on to the host; except that it came up with an "invalid user name or password" message. I checked the details and they were correct. I double-checked the details: domain, username, password. They were definitely all correct. After having stared at this for a few minutes, I realised that the host installation had used a US keyboard layout, and I was entering the details on a UK layout keyboard. When I re-entered the same details using the US layout, it let me access the host. It appears that there is no UK layout option in the host installation routine.
Looking at the details of the host, I could create VMs and allocate resources, but this wouldn't allow me to manage the other hosts. To do that, I had to install the vCenter Server product and use that to do all the management. The idea was that this would be installed on the first VM, but when I tried to install the software, it produced an error stating that it was not possible to install the server software on a VM. This made no sense; the material I had received all indicated that best practice would be to install vCenter Server on a VM.
After some analysis, the solution became obvious; I had the wrong version of vCenter Server. I had downloaded it from the VMware web site; once you get used to the site, it is quite sensibly laid out, but to start with it can be a bit overwhelming. When I checked, there was a particular set of downloads to match the version of VMware that we had purchased, and this was where I should have got the software from. So I downloaded that version; and yes, it installed straightaway.
So far, so good; I had the hosts running, the SAN was available, and with the vCenter Server software installed, I could see all of the hosts and start to do some more detailed work. Unfortunately, we had a number of projects on the go at the time, so I was involved in another one for a few days before I could get back to playing with the VMs.
When I did get back to the virtual platform, I wanted to make the storage on the SAN unit available. I was able to enable the iSCSI initiators, and these showed the disk allocation on the SAN unit. However, the storage was not available to the VMs; there is a check box that needs to be ticked for that to happen.
Later on, I realised that we still had an issue; although the storage area was available to VMs on the one host, it wasn't available to the others. Further checking revealed yet another setting (this time on the SAN itself) that needed to be checked, and as soon as this was done, each of the hosts could see all of the storage areas.
Unfortunately, I got this resolved after I had created the first VM and installed vCenter Server. This means that the image and the virtual disk are actually stored on a local drive on the host server, which is not quite what was planned. It appears that this can't be moved using the VMotion process, but I may be able to get around it by using the P2V function at a later stage. If this works, I'll write another piece about it later.
So at this point we had all of the hardware installed, all of the software licensed and running, our first VM created and some templates ready for future use. We can now manage the systems and have experimented with copying, snapshotting, moving VMs using the VMotion process, modifying resource allocation and deleting the various unwanted bits. It has taken a bit of time, but there is now a good level of confidence in the product, and we are comfortable that we can move to the next level. And there will be more on that next time.
Wednesday, 15 December 2010
BCS - Retro computing
Most IT staff work in fairly small groups; even in the larger companies, teams break down into groups of just a few people. As a result, it's easy for people to develop a "silo" mentality, and forget that there is a larger world out there.
For that reason, I like to try to get to various events where there is an opportunity to speak to others within the profession. It's really useful to be able to share ideas, talk about common problems, to know that there are other people that have exactly the same pressures on them and all too often, the same feeling that their work is not appreciated.
The BCS in the South West organises a number of events throughout the year, although there tend to be more during the winter and spring terms; during the summer months, most of the organisers, being in academia, are busy with exams.
The latest event at the University of Plymouth was a talk on "Retro computing"; a look back at some of the hardware and software systems of the last half century. It was quite amazing to recall the changes that have occurred over that time, to see once again the boxes that seemed so modern and powerful at the time.
They had a fair amount of older equipment on display; items that have been picked up over the years and kept to form part of a "museum of computing". People had the opportunity to use a few of these old devices; it was quite interesting to be able to once again play a game of Lemmings on the old Amiga.
However, it wasn't just about games; they had some emulation software that showed how some of the older systems used to run and what kind of business systems ran on them. As someone who once had the opportunity to create a program from scratch, designing the flow chart and then creating the commands on a series of large punch cards to be processed on the mainframe at County Hall, I had a strange sense of nostalgia.
For some of those there, most of the hardware was beyond their recall; several students were actually younger than some of the exhibits, which is quite a scary thought! It just makes me wonder if my nice new shiny HP laptop will seem as ancient and irrelevant in another 20 years.
The BCS South West are also starting a new web site to act as a repository for information on older computing. The site exists but nothing is available just yet (http://retrocomputing.org); I'm told that they intend to build it up slowly with the help of a few volunteers in the months to come.
In all, it was a really interesting evening, with a lot to see and do. It was also amusing to see who were the highest scorers in "Crazy Taxi"! Clearly there were a lot of the people with grey in their hair that had spent just as much time playing games as some of the younger generation.
Saturday, 11 December 2010
Sec-1 Penetration Workshop
On Friday 10th, I went to a workshop event held in Bristol. It was organised by Sec-1, a specialist security firm (http://www.sec-1.com/) - note the correct address; if you get it wrong, you end up at a completely different type of business!
Obviously, these events are to promote the company and their services; however, it wasn't just a massive sales pitch. The main purpose was to offer people advice about maintaining good security practice by illustrating just how easy it is to break into systems and highlighting the reasons why.
The speaker was Gary O'Leary-Steele, and he spoke with passion, conviction and a great deal of knowledge. He indicated that they have carried out many penetration tests over the years, and in most cases they could use the same report over and over again, just changing the name of the organisation. This is particularly the case with the 150 NHS trusts they have tested, but is also often true of many private sector businesses.
He stated that in many cases, people have failed to install patches that were issued for specific problems, often long after the issue was identified. As it happens, I did a quick search on MS06-040 and MS08-067, the two main culprits, and autocomplete found each one after just the first four characters; the problems are that well known.
He went on to discuss some of the most common problems and illustrated how they could be used to access systems. He also demonstrated how easy it can be to identify vulnerable systems, get access to accounts with inappropriate levels of security permission, crack passwords and elevate permissions. In most cases, the team of testers expect to get access within 30 minutes - if they take longer than an hour, the others tease them unmercifully!
Most of the tools they use are freely available on the Internet. In some cases, they use commercially written items for which there is a small charge, but generally those are for the real high-end stuff. Each tester has their own favourites, in much the way that people do with most other kinds of software.
Whilst going through the potential problems, Gary also indicated some of the possible solutions, often by using the software tools to confirm the problem, then implementing suitable practice or policy to ensure that something is done to minimise the problem or reduce the impact.
It should also be said that many of the exploits demonstrated were in Microsoft operating systems or software; but the speaker was very careful to highlight that issues are just as prevalent in other products. Mac, Linux, Adobe and others were all shown to be just as insecure. In many cases, this was down to installation or configuration, but equally there were many flaws straight out of the box.
I'm not a security specialist, although I have had some training in this area. I enjoy some of the work involved, although it has to be said that I don't think I have the necessary skills to make this my specialism. However, I know enough to be able to say that there are a lot of people who suffer from "delusions of adequacy"; they think that because they use a particular product, or do a specific thing, they are invulnerable. Often, they are so wrong that it is difficult to take them seriously in anything.
I'm going to say that it was a great day, a really useful workshop and I was very impressed by the whole event. If they organise any more (and I'm told they certainly hope to) I would very strongly suggest that you grab the opportunity to get along and take advantage of the information and advice that they are willing to hand out free of charge.