Monday, 24 August 2009
People are social animals, for the most part. We love to communicate with one another and share information about what we and the people we know are up to – indeed, it has been suggested that this is why human beings developed the capacity for speech. In years gone by, people wrote letters to one another or, if the matter was urgent, sent telegrams. Later, the telephone allowed real-time conversations between people, and that has led to the more modern methods we use today.
With the growth of the Internet, different methods of communicating have been developed, and people use them to enhance the way they converse. This has created a major problem: many workers want access to these new methods of communicating – instant messaging, blogs, wikis, social networking, video streaming and photo sharing sites. That increases the amount of data being transmitted and stored, which in turn increases the pressure on resources and adds to the possibility of security issues.
Now, many will maintain that these new methods provide significant benefits to the modern organisation – arguments are put forward such as “providing new revenue streams”, “improving marketing opportunities” and “ensuring real-time communications” – all the usual buzzwords you get from people trying to persuade you that this is the way forward.
I like to think that I am quite open-minded about most technologies, and I can see that there is a lot to be gained on a personal level from the use of these products. I can even see a number of practical applications within a business environment, and I have planned some projects to explore a few of these. However, I do have a number of concerns about the security of these systems and about how much time people will spend on them.
For example, being cynical, I know that most data losses are caused by internal staff, not by outsiders hacking into systems. Most companies are extremely protective of their data, but social networking facilities can make it very easy for that data to be copied and moved.
There is also the possibility that these systems could provide a route in for malware. A user sees a new “toolbar” application advertised in an IM message and clicks to install it, not realising that what they are really installing is a keystroke logger.
Of course, there is also the concern that staff may spend more of their working day chatting or posting items online rather than doing the job they are supposed to be doing (I’m doing this in my lunch break!). And we have all heard the embarrassing stories of people posting comments in emails or on Facebook that are then sent around the world. These can cause an organisation to lose business and can come back to haunt a company for many years after the original event.
I think there is a place at work for some of these tools – if we can teach people how to use them properly. But we have to make sure they are being used appropriately, we need a set policy so that all staff know where they stand, and we have to be able to enforce it. I doubt that we can completely block their use, but I think it appropriate to set some ground rules so that we can at least try to make sure these tools are not being misused.
What do you think?
Tuesday, 11 August 2009
The battle of the Vs – VMware vs Hyper-V
Last week, one of the guys in the department and I had the chance to attend an event that demoed VMware 4 and Windows Server 2008 R2 with Hyper-V. Many thanks to the people at Nexus in Exeter (http://www.nexusopensystems.co.uk/) who hosted the event, especially Gary, who did the demos.
The presentations were quite straightforward. The first session covered VMware, and we were shown the installation process right from the bare-metal server. The installation process itself is Linux-based and took about 20 minutes on relatively low-spec machines. Once it was all up and running, we had the chance to see some virtual servers being created – literally just a few minutes’ work. We also discussed the switching process and the various options, and briefly saw how to create virtual switches.
There was a bit of a discussion about the merits of the VMware product – how it allows you to “overload” (overcommit) by giving the virtual servers allocations, such as RAM, that total more than the physical amount actually available. I would want to check this out for myself, but it certainly seemed to run OK.
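To make the “overload” idea concrete, here is a minimal sketch – my own illustration, not something from the demo – that works out the overcommit ratio for a host. The VM names and figures are entirely made up.

```python
# Rough overcommit calculator - all figures are invented examples.
physical_ram_gb = 32  # RAM actually fitted in the host

# RAM allocated to each virtual server (hypothetical machines)
vm_allocations_gb = {
    "file-server": 8,
    "mail-server": 16,
    "crm-db": 12,
    "print-server": 4,
}

allocated = sum(vm_allocations_gb.values())
ratio = allocated / physical_ram_gb
print(f"Allocated {allocated} GB against {physical_ram_gb} GB physical "
      f"(overcommit ratio {ratio:.2f}x)")
# The hypervisor gets away with this because guests rarely use their
# full allocation at once, and idle pages can be shared or reclaimed.
```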
We then discussed clustering and resilience, and the demo that followed showed a high-definition media file being moved from one virtual server to another – the file played continuously during the move, without even a slight pause. Really impressive! Certainly, this would be of significant value where you have to move production data while people are still working on it.
The demo actually ran over a bit as we were really interested in the product and had several questions about various aspects – and Gary was only too pleased to show us the relevant bits in response. There is no question that it is an awesome product.
We then had the chance to see Hyper-V in action – for me, the first chance I’ve had to look at it. We have Windows Server 2008, but not the R2 version, which contains the hypervisor. The main difference between the two products is that the VMware hypervisor sits directly above the hardware and handles all of the driver requirements itself. Hyper-V also sits just above the hardware, at the same level as the OS, but each virtual server handles its own drivers separately. It also doesn’t allow overloading of resources – once you hit the limit, that’s it.
From what we saw, Hyper-V runs well – certainly it provided a smooth experience while we were watching, and the test moving the media file ran pretty much the same. There were a few differences in the way the virtual networking operates, but it seems to run as we had expected. It definitely doesn’t have all the functionality of VMware, but then there is a price difference – it’s a lot cheaper.
I’ve been looking at this for a few months now (in between other jobs) and I’m convinced that virtualisation is the way to go. It will certainly cut costs in terms of the electricity bill, and it will also fit very nicely into our backup / business continuity / disaster recovery planning. About half of our servers will be five years old next year, so it seems a good time to start planning a move to a virtualised system.
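As a back-of-an-envelope check on the electricity saving, here is a quick sketch. The server counts, wattages and unit price are all assumptions I have plugged in for illustration, not measured figures.

```python
# Back-of-an-envelope electricity saving from server consolidation.
# Every figure below is an assumption, not a measurement.
old_servers = 10        # physical boxes to be retired (assumed)
old_watts_each = 300    # average draw per old server (assumed)
new_hosts = 2           # virtualisation hosts replacing them (assumed)
new_watts_each = 500    # bigger boxes draw more (assumed)
pence_per_kwh = 12      # assumed electricity tariff

hours_per_year = 24 * 365
saved_watts = old_servers * old_watts_each - new_hosts * new_watts_each
saved_kwh = saved_watts * hours_per_year / 1000

print(f"Roughly {saved_kwh:,.0f} kWh per year, "
      f"about £{saved_kwh * pence_per_kwh / 100:,.0f} per year")
```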
We have had a couple of visits to different vendor demos and they have been really useful. Although nothing has been decided, we are leaning towards the Dell EqualLogic equipment – it seems to be everything we could want and a bit more. The big issue, of course, is what software to run on the servers, which is why we wanted to get to the event in Exeter. However, I’m still not sure which one is the best option for us.
I’ve therefore planned that in the new year, say Jan / Feb 2010, we will get ourselves a spare server – there are plenty of cheap machines around at the moment. There is a trial version of VMware available and, of course, the TechNet subscription allows us to install an evaluation copy of Server 2008 R2. Hopefully, this will give us the chance to actually work with both products so that we can get a really good idea of which one we prefer – all we have to do then is sell it to the powers that be!
Wednesday, 5 August 2009
Terminal headaches
We have been trying to implement some new software for the CRM – the product has been used by one of our sites for some time, but not by the others. They had tried to use it before, but it’s not designed to be used across a WAN, so it had been set up as multiple databases, and when they started getting issues they just stopped using the product.
The company concerned have issued a new version, and our salespeople have seen it and really like it. The vendor has produced a modified client GUI that runs in a web browser – the idea being that users on the remote sites would use that, so we could run a single database for all sites.
Well, that WAS the idea – the software runs OK locally, but when it ran through the browser it was noticeably slower. Although it was usable, there was a definite speed issue, and we were worried that the users on the other site might not be convinced enough to use the product if the speed was poor.
It then occurred to me – the database was installed on the server, and we also had a copy of the client software installed on the server so that we could test it was running as it was being set up. I did a quick RDP to a server on the other site, then from there did another RDP back to the server on our site. The speed of operation was good – as far as I could tell, the same as if we were running it directly at this site.
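For anyone wanting to try the same trick, the shortcut itself is just a plain-text .rdp file. Here is a minimal sketch that writes one out – the hostname and username are made-up placeholders, and the settings shown are just the common ones.

```python
# Write a basic .rdp shortcut file for the connection.
# "crm-server.example.local" and the username are placeholders.
settings = {
    "full address:s": "crm-server.example.local",
    "username:s": "EXAMPLE\\crmuser",
    "screen mode id:i": "2",      # 2 = full screen
    "desktopwidth:i": "1024",
    "desktopheight:i": "768",
}

with open("crm-server.rdp", "w") as f:
    for key, value in settings.items():
        f.write(f"{key}:{value}\n")

print("Done - double-click crm-server.rdp to connect.")
```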
So I set up some shortcuts and emailed them to the users at the remote site, then talked them through how to save and use them. They agreed that this worked well and were really happy with the speed of operation. But then we hit a snag – only two users at a time. As we are talking about some 20 remote users, there is clearly a bit of a problem.
Now, my predecessor had bought volume licences for a lot of software, which included some terminal server licences, but unfortunately none of the paperwork specified what was what. I found the paperwork ages ago and set up a profile on eOpen to manage all of the various items. https://eopen.microsoft.com/EN/default.asp – this is a great resource and I suggest you check it out if you don’t already use it. It shows you what the various bits of paper refer to and gives you details of date of purchase, vendor, type of licence, quantity, etc.
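If you wanted to keep a local copy of the same details alongside the paperwork, something as simple as this would do. A sketch only – the entry shown is invented, not one of our real licences.

```python
# Minimal local record of a volume licence, mirroring the details
# that eOpen reports. The example entry below is invented.
from dataclasses import dataclass
from datetime import date

@dataclass
class LicenceRecord:
    product: str
    licence_type: str    # e.g. volume, OEM, retail
    quantity: int
    vendor: str
    purchased: date
    agreement_no: str    # the number printed on the paperwork

licences = [
    LicenceRecord("Terminal Server CALs", "volume", 25,
                  "Example Reseller Ltd", date(2006, 3, 14), "V1234567"),
]

for lic in licences:
    print(f"{lic.quantity} x {lic.product} ({lic.licence_type}), "
          f"bought {lic.purchased} from {lic.vendor}")
```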
However, when I double-checked, the Terminal Services licence server had been set up and the licences applied – so that wasn’t the problem. I then searched through the various bits and pieces and subsequently realised where it was all going wrong. The server the software was installed on was set to Remote Desktop (administration) mode, which only allows two concurrent connections, rather than full Terminal Services mode. A quick couple of clicks and problem solved.
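If you want to check which mode a server is in without clicking around, a registry query will do it. A sketch under stated assumptions: I believe the mode is reflected in the TSAppCompat value under the Terminal Server key, but do verify the value name against your version of Windows before relying on it.

```python
# Check which mode the server is in - run on the server itself.
# ASSUMPTION: TSAppCompat distinguishes full Terminal Services
# (application server) mode, 1, from Remote Desktop for
# Administration, 0; verify against your Windows version.
import winreg

key = winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE,
    r"SYSTEM\CurrentControlSet\Control\Terminal Server",
)
value, _ = winreg.QueryValueEx(key, "TSAppCompat")
winreg.CloseKey(key)

if value == 1:
    print("Full Terminal Services (application server) mode")
else:
    print("Remote Desktop for Administration - limited to two sessions")
```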
So now the staff at the remote site can all connect to the server and all use the CRM software. It seems to run just as quickly when half a dozen of them are using it – so they are all happy!