Industry news

  • 19 May 2011 12:00 AM | Anonymous

    ClickSoftware Technologies Ltd., the leading provider of automated workforce management and optimization solutions for the service industry, has announced that the UK's largest water and wastewater services company, Thames Water, has selected the ClickSoftware ServiceOptimisation Suite.

    Thames Water identified ClickSoftware’s workforce management and mobility platform as the key technology for ensuring significant improvements in the efficient use of its workforce and assets, and in the quality of service it provides to its customers, through optimized scheduling of its 2,000-strong field force and contractors.

    Thames Water is under significant, industry-wide pressure from the Water Services Regulation Authority (OFWAT) to deliver value for money to its 13.8 million customers during the current five-year Asset Management Plan (AMP) period. The most effective way to comply with OFWAT’s requirements whilst improving customer service levels is to optimize the activities of its 2,000 field-based staff.

    “Workforce management technology will give us the edge in maximizing the performance of our field operations and, at the same time, help to reduce our back office costs and meet our regulatory commitments,” said Andrew Bilecki, Chief Information Officer at Thames Water.

    “Following a rigorous European tender process, we selected ClickSoftware as it is the best fit for our functional and technical requirements. Given the rapid mobility advancements over the last couple of years, we wanted to ensure we had the best mobile solution in order to mobilize SAP processes into the field and support our workforce with the right data at the right time,” Bilecki added.

  • 19 May 2011 12:00 AM | Anonymous

    BMC Software, Eucalyptus Systems, HP, IBM, Intel, Red Hat, Inc. and SUSE have announced the formation of the Open Virtualization Alliance, a consortium committed to fostering the adoption of open virtualization technologies including Kernel-based Virtual Machine (KVM). The consortium will promote examples of customer successes, encourage interoperability and accelerate the expansion of the ecosystem of third party solutions around KVM, providing businesses improved choice, performance and price for virtualization.

    The Open Virtualization Alliance will provide education, best practices and technical advice to help businesses understand and evaluate their virtualization options. The consortium complements the existing open source communities managing the development of the KVM hypervisor and associated management capabilities, which are rapidly driving technology innovations for customers virtualizing both Linux and Windows® applications.

    KVM virtualization provides compelling performance, scalability and security for today's applications, smoothing the path from single system deployments to large-scale cloud computing. As a core component in the Linux kernel, KVM leverages hardware virtualization support built into Intel and AMD processors, providing a robust, efficient environment for hosting Linux and Windows virtual machines. KVM naturally leverages the rapid innovation of the Linux kernel (to virtualize both Linux and Windows guests), automatically benefiting from scheduler, memory management, power management, device driver and other features being produced by the thousands of developers in the Linux community.
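
    To make the "hardware virtualization support" point concrete, here is a minimal sketch of how a KVM-accelerated guest is typically started: the host exposes the hypervisor through /dev/kvm, and QEMU uses it when launched with the -enable-kvm flag. The disk image path and guest sizing below are placeholders, not part of the announcement.

        import os
        import subprocess

        DISK_IMAGE = "guest-disk.qcow2"   # placeholder path to an existing guest image

        # KVM exposes hardware virtualization (Intel VT-x / AMD-V) through /dev/kvm.
        if not os.path.exists("/dev/kvm"):
            raise SystemExit("KVM not available: check BIOS settings and the kvm kernel modules")

        # QEMU uses the in-kernel KVM hypervisor when started with -enable-kvm.
        subprocess.run([
            "qemu-system-x86_64",
            "-enable-kvm",          # hardware-assisted virtualization via KVM
            "-m", "2048",           # 2 GB of guest memory
            "-smp", "2",            # 2 virtual CPUs
            "-hda", DISK_IMAGE,
        ], check=True)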

    Members of the Open Virtualization Alliance share a common interest in supporting open virtualization, and are involved in the development, distribution, support or use of KVM or offerings which use it, or have other business interests in it. By providing an open virtualization alternative, they are offering their clients choice and enabling them to select the virtualization products best suited to their business needs.

  • 19 May 2011 12:00 AM | Anonymous

    The European Commission wants to gather views on how to exploit cloud computing in business

    The European Commission is seeking to gather views on the cloud to help it build its own cloud computing strategy.

    Interested parties are invited to contribute to the EC’s online public consultation, and have until 31 August 2011 to provide their input.

    The EU executive believes that cloud computing could generate 35 billion euros (£30 billion) in revenue in Europe by 2014. It also believes that the right regulatory framework could help businesses and governments make considerable cost savings by using the technology.

  • 19 May 2011 12:00 AM | Anonymous

    IPsoft research provides insight on how automation technologies can help IT departments become more proactive and aligned to business objectives

    IPsoft, a provider of autonomic IT management services, today reveals that over a quarter (26%) of IT decision makers (ITDMs) spend the equivalent of two days a week on mundane IT management. This could be costing British businesses up to £18.9 billion in lost hours, based on the average IT manager salary*.

    The research of over 200 senior ITDMs also reveals that 54% believe automation saves money and frees up time for more proactive activity, which is critically important given that more than half admit they spend less than two days a week on innovation.

    Terry Walby, UK Managing Director, IPsoft, comments: “Having skilled IT staff spend an excessive amount of time on basic maintenance tasks is potentially demoralising. It not only creates poor productivity and wastes money, but it can also impact retention. A firm with its best people spending days each week on routine work will eventually either lose the momentum for proactive projects, lose the people, or perhaps lose both. By redeploying IT staff in a more creative direction, firms can re-focus their energy on the most important corporate goals.”

    Additional findings from the survey include:

    IT Operations Drain Resources

    More than half of respondents state that they would be able to embark on new strategic projects if their IT operations were automated, highlighting that innovation is a big priority for ITDMs. As well as time for strategic projects, 43% recognise that they would have greater time for planning and defining the organisation’s strategy, ensuring better alignment with the business as a whole.

    Error Resolution Top Priority

    Interestingly, whilst the debugging and fixing of systems is often cited as an area of enjoyment for IT personnel, the results suggest otherwise. The detection, diagnosis and fixing of errors is ranked as the biggest priority for automation (32% ranked it top, 49% ranked it in the top three); however, detection and diagnosis alone scored much lower (6% ranked it top, 31% ranked it in the top three), demonstrating that ITDMs are looking for technology to self-heal and fix problems, not just detect them.
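
    A self-healing loop of the kind respondents are asking for can be sketched very simply: detect a failure, capture enough context to diagnose it, then apply a remediation. The sketch below is illustrative only; the service name is a placeholder and a real automation product would use far richer diagnosis and remediation rules.

        import subprocess
        import time

        SERVICE = "example-app"            # placeholder service name
        CHECK_INTERVAL_SECONDS = 60

        def detect() -> bool:
            """Detect: is the service reported as active by systemd?"""
            result = subprocess.run(["systemctl", "is-active", "--quiet", SERVICE])
            return result.returncode == 0

        def diagnose() -> str:
            """Diagnose: capture recent log lines for inspection or rule matching."""
            logs = subprocess.run(
                ["journalctl", "-u", SERVICE, "-n", "50", "--no-pager"],
                capture_output=True, text=True)
            return logs.stdout

        def remediate() -> None:
            """Fix: the simplest self-healing action is a restart."""
            subprocess.run(["systemctl", "restart", SERVICE], check=True)

        while True:
            if not detect():
                print("Service down. Recent logs:\n", diagnose())
                remediate()
            time.sleep(CHECK_INTERVAL_SECONDS)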

    Automation Confuses Corporates

    Respondents clearly understand how automation can benefit the organisation, but there is a lack of knowledge of what is already possible. 1 in 10 believe that it isn’t possible for technology to repair itself without human intervention, yet 66% agree that technology can learn and adapt to user needs over time. In addition, over a third (35%) seem concerned about the idea of artificial intelligence and self-learning systems, with a quarter refusing to accept that it is no longer pure science fiction.

    Terry Walby, Managing Director at IPsoft, concludes: “Whilst there is still some confusion in the market as to what automation can already achieve for businesses, the IT departments that choose to adopt the technology will be able to deliver significant benefits back to the business. Automation has moved on from just being complicated scripts on a local network to a sophisticated self-learning and self-healing system delivered through expert computing systems.”

  • 19 May 2011 12:00 AM | Anonymous

    Now that we are emerging from the economic downturn, corporations are shifting emphasis from maintaining position to focussing on achieving sustainable, profitable growth, and are seeking ways to scale and replicate successful business processes globally. Process benchmarking is one of the most effective ways of identifying the best processes to replicate, or of highlighting those which need improvement.

    Benchmarking provides an organisation with a comparative view of its own business processes against the best-in-class standard, allowing it to manage process optimisation.

    Benchmarking can only add value if the right metrics are chosen; these should be the parameters which truly impact overall business outcomes. Looking at a business process end to end, across functions, from the perspective of these metrics will enable discovery of the right levers for achieving the strategic goals.

    For example, the key business outcome measures for any source-to-pay process are total cost of ownership (TCO), working capital optimisation and material availability. These can be directly linked to more fine-grained Level 2 and Level 3 key performance measures and drivers, such as spend through preferred suppliers, requisition-to-PO cycle time, cost of PO processing, percentage of electronic POs and days payable outstanding. If a business unit is monitoring the right Level 2 and Level 3 metrics, it will be able to quickly identify gaps in process performance and their consequences, and take steps to fix the processes and hence influence the key business outcomes.
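
    To make the link between the Level 2/3 drivers and the outcome measures concrete, the short sketch below computes three of the metrics mentioned above. All input figures and names are invented for illustration and are not drawn from any benchmark source.

        # Illustrative source-to-pay metrics; every figure below is made up.
        purchase_orders = 12_000            # POs raised in the period
        electronic_pos = 9_600              # POs transmitted electronically
        po_processing_cost_total = 540_000  # total cost of running the PO process (GBP)
        accounts_payable = 4_200_000        # average accounts payable balance (GBP)
        annual_purchases = 36_500_000       # total purchases in the period (GBP)

        cost_per_po = po_processing_cost_total / purchase_orders
        pct_electronic_pos = 100 * electronic_pos / purchase_orders
        days_payable_outstanding = 365 * accounts_payable / annual_purchases

        print(f"Cost per PO:              £{cost_per_po:,.2f}")
        print(f"Electronic POs:           {pct_electronic_pos:.1f}%")
        print(f"Days payable outstanding: {days_payable_outstanding:.1f} days")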

    Once appropriate metrics are identified, finding the right benchmark data for intra- and cross-industry comparison is vital to help set improvement targets. Non-profit or academic organisations like APQC, and companies involved in managing large-scale global business processes across verticals, can be a good source of benchmark data. Many organisations have a myopic view, choosing only to use industry-specific benchmarks. This is usually an error, as it can prevent them from identifying and leveraging applicable process best practices from outside their sector.

    This is where developing an enterprise process view can be helpful. For example, over 60% of indirect procurement categories remain the same for most organisations, and hence their source-to-pay processes remain similar, so using best-in-class measurements irrespective of industry vertical is a rational way of using the available benchmark data.

    Knowing where your process ranks and where you need to be is the important part of benchmarking; the rest of the journey becomes tactical. Whether a company is a leader, a laggard or a median performer on a benchmark scale, setting a realistic target for process change is critical to drive improvements and optimally utilise the resources available.

    A regular process benchmarking programme can be an effective way to improve the overall process ecosystem, linking and breaking silos between business units, regions, countries and service providers to help make the business’ processes scalable and nimble in response to external change.

  • 19 May 2011 12:00 AM | Anonymous

    When it comes to managing your company’s IT systems, it’s vital to consider the legacy technologies you have in house as part of the IT strategy and management plan. ‘Legacy technologies’ can refer to any systems or applications that have been in existence for some time, and are either becoming hard to support due to dwindling skills or discontinued support from the vendor, or are no longer core to the company’s operation but do still need to be retained.

    Legacy systems pose potential risks to your IT operation and business continuity for a number of reasons. First, they create an extra burden on your day-to-day IT operation, meaning resources cannot be freed up to focus on more strategic, value-added initiatives. Second, the need to maintain and work with these legacy systems leads to an increasing dependency on the limited number of staff with the relevant skills, which increases risk at the operational level due to the reduced talent pool and support.

    Imagine a scenario where a bespoke yet business-critical application for Company X was developed in RPG on the IBM AS/400 platform in the 1990s. As time moves on, there is only one person in the IT department – who is just about to retire – capable of maintaining and updating that system. While the application still meets a particular business need, the evolution of technology means that the external and internal resources available to maintain this system have decreased. Failing to address this situation is a risk to the business.

    Therefore, actions MUST be taken with regard to your legacy technology. Don’t let it become a stumbling block to your ability to address business priorities.

    There are three common options for managing legacy systems/applications:

    1. You can migrate to an entirely new technology (i.e. a new package)

    2. You can continue to develop your existing technology (i.e. modifying the code) using external providers if the in-house resource is not available

    3. You can outsource the management of your legacy technology (which can include the ongoing development of the application by the outsourcer)

    Each option has different advantages and disadvantages that you need to consider. Let’s start examining these options one by one.

    First, if you decide to migrate to an entirely new system or application, you could expect the main benefits to include access to new (and possibly more widely used) technology with greater functionality. This option may also give you the opportunity to update your key business processes as part of the roll-out. However, the amount of work involved in moving to a new package can be large and complex, and it does carry financial, operational and resourcing risks.

    Moreover, you’ll also need to make sure that the new technology ‘fits’ well with the rest of the business, especially if you’re planning to implement a packaged solution, and that it addresses your specific requirements. If you would rather build a brand new customised solution, be aware that there are risks here too: your new bespoke system or application could very quickly become outdated and a legacy itself, and therefore spark a costly (and endless) cycle of product development and obsolescence.

    As such, many companies may prefer to continue developing their existing technologies rather than replacing them completely. This second option can be achieved by updating the necessary code and/or by developing a web-based wrapper that can be used to enhance areas such as the user interface.
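
    As a rough illustration of the web-based wrapper idea, the sketch below exposes a single legacy routine over HTTP using only the Python standard library. The function legacy_stock_lookup and its behaviour are invented placeholders for whatever bridge (screen-scrape, RPC, direct database query) the legacy platform actually offers.

        import json
        from http.server import BaseHTTPRequestHandler, HTTPServer

        def legacy_stock_lookup(item_code: str) -> dict:
            """Placeholder for a call into the existing legacy system."""
            return {"item": item_code, "on_hand": 42}

        class WrapperHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                # Expect URLs of the form /stock/<item_code>
                if not self.path.startswith("/stock/"):
                    self.send_error(404)
                    return
                item_code = self.path.split("/", 2)[2]
                body = json.dumps(legacy_stock_lookup(item_code)).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            # Modern clients talk JSON over HTTP; the legacy system stays untouched.
            HTTPServer(("0.0.0.0", 8080), WrapperHandler).serve_forever()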

    Unlike a full-scale replacement, this approach is appealing in that it is incremental, and therefore reduces the initial time and financial commitment required. At the same time, it will help to prolong the life of your legacy system whilst also reducing many of the risks and problems associated with it. However, in order to achieve these benefits, you will need to plan the project very carefully and make sure that you have all the skills and resources needed to manage such a complex undertaking.

    In many cases, the third option – outsourcing the management of the legacy systems – can provide the most immediate and straightforward answer to the challenge. Remember that a well-qualified outsourcing provider operates with a rich pool of resources to give you the full support that you need, without the constraints you face in-house (such as staff going on leave or being deployed to other projects). Outsourcing the management of legacy systems and applications will free up your IT department to focus on more strategic initiatives while ensuring that you have the consistent and reliable service you expect.

    Most important of all, if your longer-term solution is inevitably to migrate to a new technology or to continue developing the existing one, outsourcing gives you the time to evaluate these options carefully while mitigating the risks associated with the legacy systems and applications.

    Whatever your decision, you will need to consider very carefully whether the solution you choose is going to address the business requirement (at both a strategic and a functional level) effectively, as well as other important factors such as cost, timescale, service continuity, and the skills and resources that would be required for the change.

    When weighing up all these factors, you may find that the decision to outsource the management of your legacy systems and applications will tick all of the most important boxes, especially as you’ll be gaining access to a rich pool of support and skills instead of relying on a limited number of staff to deliver continuous services. Plus, an outsourcing environment will give you the resources (and the breathing space) that you need to reduce your operating risk, so that you can evaluate the business case for your other options more thoroughly. There can’t be an easier solution to managing legacy technologies than outsourcing!

  • 19 May 2011 12:00 AM | Anonymous

    Traditionally, outsourcing application development has enabled organisations to reduce IT operating costs, tap into specialist skills and allow core, internal IT staff to focus on strategic tasks. The need for CIOs to capitalise on these benefits looks set to continue; in fact, a recent survey by Harvey Nash found that application development remains the most intensively used outsourced service among CIOs.

    Fast turnaround on application development projects is, of course, a key metric against which success is measured. It’s about getting the job done within budget and timescales and, for the client, cost is – and will continue to be – a principal driver. For the provider, delays not only mean lost revenue, but over-runs can also have a longer term impact on their reputation and competitiveness.

    Of course, taking an application from idea to deployment is not without risks and complexities, so equipping teams with tools that enable fast completion and remove the complexity of projects can make a considerable difference between success and failure.

    What’s more, in this cloud-driven, multi-device age, the ability to ‘future-proof’ applications is going to be an increasingly important success factor. When creating a new application, an organisation may not always know exactly which platform or device it will require in the future. It can be particularly difficult to predict what changes lie ahead in this era of mobility and cloud, where on-premise applications may need to be broadened to allow different deployment modes such as SaaS, or taken to new platforms, for example porting a mobile application from Windows to Android.

    New approaches are now helping development teams faced with such challenges to simplify the code-writing process, reduce development time and then distribute an application to a number of different deployment channels. These engines use pre-compiled and pre-configured business logic containing pre-written functionality and services. An application platform typically results in fewer coding mistakes, faster project completion and the ability to adapt applications to a business’s changing needs. For example, it gives teams the option of building and running a cloud application offering, in addition to a client-server on-premise deployment model, from a single development process. The application can be repurposed at any time for a different channel without the need to re-code it entirely from scratch.
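
    The underlying idea, keeping business logic independent of the delivery channel, can be sketched as follows. The quote_price function and the two channels shown (a command-line adapter and a WSGI web adapter) are illustrative assumptions, not a description of any particular vendor platform.

        import json
        import sys
        from urllib.parse import parse_qs

        # Deployment-neutral business logic: written once, reused by every channel.
        def quote_price(quantity: int, unit_price: float) -> dict:
            discount = 0.1 if quantity >= 100 else 0.0
            return {"quantity": quantity,
                    "total": round(quantity * unit_price * (1 - discount), 2)}

        # Channel 1: command-line adapter for an on-premise, scripted deployment.
        def cli_main() -> None:
            print(json.dumps(quote_price(int(sys.argv[1]), float(sys.argv[2]))))

        # Channel 2: WSGI adapter, deployable behind any cloud or SaaS web server.
        def wsgi_app(environ, start_response):
            params = parse_qs(environ.get("QUERY_STRING", ""))
            body = json.dumps(quote_price(int(params["qty"][0]),
                                          float(params["price"][0]))).encode()
            start_response("200 OK", [("Content-Type", "application/json")])
            return [body]

        if __name__ == "__main__":
            cli_main()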

    The benefits of this work both ways - for the outsourcing partner it is about delivering choice to the client, enabling faster project turnaround and empowering customers with the option to take applications to whichever device and platform they wish with minimal impact on time and resources. The ability to offer this kind of differentiated service - whether it’s creating an application to help sales staff access customer data from their mobile phones or enabling field engineers to access stock information more quickly - could also be the route to reaching a wider market and enabling greater business opportunities in the future.

  • 17 May 2011 12:00 AM | Anonymous

    It is understandable for in-house DBAs (database administrators) to view a remote database support service with some suspicion. After all, managing the day-to-day running of the database is their primary role. However, depending on the size of the organisation, DBAs are frequently being asked to take on more than just the traditional day-to-day management and monitoring of a database.

    In a survey by IDUG & CA, only 31% of respondents in small organisations (1–1,000 employees) said they were primarily involved in database administration. This means that more than half are primarily involved in other activities which take them away from day-to-day administration and maintenance. DBAs, especially those in smaller teams, carry a huge responsibility as their workload increases and they are pulled into different projects whilst still having to cope with the day-to-day tasks.

    In many small to medium-sized organisations there may be just one or two DBAs managing a mission-critical system. Perhaps those DBAs would actually welcome an extra pair of hands. Would they object to having another full-time team member to help carry some of the load? Probably not; in fact, another team member would likely be welcomed with open arms. So why shouldn’t this extra pair of hands be a virtual team member rather than another person in the office?

    For IT management there are compelling financial reasons for deciding to outsource database management. There’s a tipping point for most IT teams in small to medium-sized organisations where they go from having just about enough resource to cope, to a point where they absolutely have to make a change. When this point is reached it’s time to make a decision: hire a new employee or take out a managed service contract. Many organisations want to keep their existing talent and knowledge in-house, especially in the world of DB2 and database management, and so discount outsourcing.

    However, the cost and time involved in recruiting and retaining an experienced DBA can be prohibitive, so sometimes the status quo is maintained and an already stretched DBA team is stretched further. The thought of “outsourcing”, particularly in small to medium-sized organisations, can seem hugely daunting and can leave people fearing for their jobs. It doesn’t have to be that way though.

    There are many advantages for in-house DBAs if their organisation chooses to use the services of a remote support partner rather than employ an extra member of staff, not least because the partner doesn’t have to be added to the coffee round! The DBA team gains the flexibility to hand off the areas of maintenance and administration which are most time-consuming and labour-intensive, freeing them up to work on other projects. Or perhaps they simply want to get rid of the out-of-hours calls.

    A remote support solution can and should take the strain off the DBA team by looking after the day-to-day support issues, and it is usually far cheaper to take out a database support package than to hire and retain an extra member of staff. The DBA team gets the extra pair of hands it needs, and IT management gets a cost-effective solution for managing their critical databases.
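
    To give a feel for the kind of routine, day-to-day check that a remote support service typically automates on the DBA team’s behalf, here is a minimal sketch. The data source is a placeholder; a real service would query the database’s own monitoring views through the vendor’s driver and raise a ticket rather than print to the console.

        # Placeholder: a real check would query the database's monitoring views.
        def fetch_tablespace_usage() -> list[tuple[str, float]]:
            return [("USERSPACE1", 71.4), ("LARGEDATA", 93.8)]

        USAGE_ALERT_THRESHOLD = 90.0  # percent full; an assumed policy, not a standard

        def daily_check() -> list[str]:
            alerts = []
            for name, pct_used in fetch_tablespace_usage():
                if pct_used >= USAGE_ALERT_THRESHOLD:
                    alerts.append(f"Tablespace {name} is {pct_used:.1f}% full")
            return alerts

        if __name__ == "__main__":
            for alert in daily_check():
                print("ALERT:", alert)  # in practice: email or ticket to the support team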

  • 17 May 2011 12:00 AM | Anonymous

    The mainframe has been on a rollercoaster ride over the last 30 years: from hero to zero and now back to hero again. In the late ’90s the mainframe began to lose its “cool” new technology status to distributed platforms. Now the mainframe is coming back into fashion and, rather than moving away from it, many organisations are building on its solid foundations.

    According to research by CA, mainframes still handle 55% of companies’ data; this rises to 59% for companies with more than 3000 staff. In the latest figures from IBM, System z revenues were up 69% year-on-year in the fourth quarter, so it seems that mainframe business is currently experiencing healthy growth.

    There have been many reports of a looming mainframe skills shortage. Is this just media hype, or is there some truth behind the claims? And if so, what does this mean for UK business?

    At a recent IBM customer event on DB2 z/OS technologies, one of the first questions from the audience was about IBM’s plans to cope with the skills shortage on the horizon. There was a general consensus among those in the room that UK businesses running IBM mainframes are concerned about how they are going to manage over the next ten years with so few new mainframe professionals entering the market.

    I put the question to a LinkedIn community of DB2 z/OS users and got some interesting responses including:

    “There is a big shortage of “Mainframe” skills, and it will get bigger.”

    “From my experience, I can say that there is definitely a shortage on the horizon...I would judge that there is going to be a major crisis in the financial services area.”

    Speaking to DB2 z/OS consultants in our own organisation I have seen similar sentiments expressed:

    “While I don’t think the issue has really started to bite on the DB2 side yet, you only need to look at slightly older technology such as IMS to see the pattern. Good IMS skills are in very short supply nowadays.” Julian Stuhler, Director, Triton Consulting

    What is causing the problem?

    There are varied reasons for the looming gap in mainframe skills. Many of the first crop of mainframe experts, the key technical heavyweights who put the mainframe where it is today, are heading towards retirement age. In addition, many mainframe administrators have been re-trained and re-deployed onto distributed systems.

    For the last 10 years we’ve heard that the mainframe is dead, but the reality is that the mainframe, far from being dead, is actually growing. However, this “death of the mainframe” rumour has led to trends which have meant that new entrants to the IT workplace have been concentrating their training and career paths on distributed systems, leaving a gap that needs to be filled.

    “There are very few young people involved in zSeries – right across the skill base.” James Gill, DB2 z/OS Consultant, Triton Consulting.

    Effect on business

    The banking and financial services sectors rely heavily on the power of IBM mainframe servers and so are likely to be amongst the hardest hit if mainframe skills begin to dwindle in the marketplace. With so much resting on the successful management of mainframes and the applications that run on them it will be vital over the coming years for the financial services and banking sectors to address potential staffing issues.

    In the CA research, 66% of all respondents agreed that mainframe users will soon start to suffer, if they haven’t already, from a shrinking workforce and from relevant skills not being readily available.

    It is also worth mentioning how immigration legislation is affecting the UK job market, particularly in reference to technical skills. The option of bringing in skills from abroad has been made a lot more difficult by the introduction of a points-based entry system, and organisations are having to go through much more bureaucratic systems and processes to secure work permits for potential employees.

    What are the big players doing to address the problem?

    IBM is not blind to the issue and is actively encouraging new blood into the mainframe world by working with universities to have mainframe modules included in undergraduate computing courses. There are currently 19 universities teaching System z topics. Sheffield University has been running a course as part of its undergraduate BSc degree programme for the last three years, with Liverpool John Moores University following suit in January 2012. The University of the West of Scotland is about to become the first Scottish university to run System z courses, starting in September this year.

    Cally Beck is the Academic Initiative Leader at IBM and says “I work with large enterprise clients to help them attract, find and retain young z skills, particularly on the development side. I know it is a serious issue right across the server platforms. I also get requests for Information management, DB2, Rational and Websphere skills. Although I see requests from regions outside Europe, by far the most comment comes from Europe, and in particular UK and Germany.”

    In terms of DB2, IBM has for some time been working hard to reduce the mainframe-specific skills necessary to manage a DB2 for z/OS environment. While there will always be a need for some people with deep knowledge of the platform, it is now possible to perform many roles using GUI tools that are more familiar and comfortable for younger generations and require less mainframe knowledge. The increased role of autonomics is also playing a part in reducing the need for some of the lower-level skills.

    Another advocate of the mainframe is CA Technologies, which runs its own Mainframe Academy in a bid to grow mainframe skills within the IT community. CA is also running a scholarship project until 2016 to help drive take-up: http://www.ca.com/lpg/mainframe-academy.aspx

    How to manage the change

    Grow your own

    As we’ve seen above, there is currently a significant effort to increase the number of UK graduates entering the workforce with some level of System z expertise. One possible solution, then, is for organisations to take these young professionals and grow them into the experts of the future. This will, of course, take time, expense and effort, but it will surely be worth the investment in years to come.

    Training

    Organisations need to prepare themselves by looking internally at the skills they already have in-house. In a CA survey, 42% of financial services organisations said they are currently looking at additional skills and training needs. Re-training and up-skilling existing DBAs (database administrators) will ensure DB2 mainframe skills are retained within the organisation. This approach, backed up by bringing in external expertise where necessary, could be a more cost-effective short-term option than growing a DBA from scratch.

    Outsourcing

    Wholesale outsourcing of mainframe services is certainly an option but it does bring with it some potential complications. Outsourcing all mainframe services to a third party means that the ingrained organisational knowledge of those currently managing the system can be lost. Although the outsourcing provider is no doubt highly skilled, they don’t have that intimate knowledge of the organisation which is built up over many years.

    A better option is “partial outsourcing” where specific areas of mainframe technology support are outsourced to a niche service provider. In this way the organisation keeps a certain level of in-house knowledge but can also have back-up where necessary from experts in the field. This partial outsourcing approach supports existing staff and can help bridge the gap when skills are in short supply in-house.

    All of the above

    As already discussed, both IBM and CA are putting huge amounts of money and effort into training the next generation of mainframe experts by running education initiatives in the UK. However, it takes time for university students to filter through the system, and even more time for organisations to train them up and give them sufficient levels of experience. By combining this “new blood” with training existing staff and working with specialist service providers, organisations can build a strong resourcing plan for the years ahead.

    Triton Consulting offer a range of DB2 z/OS training, resourcing and outsourcing services.

  • 17 May 2011 12:00 AM | Anonymous

    The pros and cons of cloud computing may be regularly debated by the IT team but what about the rest of the business? How many Financial Directors (FDs) are still afraid to take the plunge? Most are familiar with the compelling benefits of lower capital expenditure, improved financial control and rapid deployment – all key issues when facing an over-worked IT department struggling to find the time to even discuss business requirements, let alone oversee the delivery of new software. So why the reticence?

    It is, perhaps, understandable that an FD or CFO may be unwilling to rush headlong into the cloud with a business critical application such as the core financial software. But there are other aspects of the financial portfolio that provide a fantastic opportunity for testing the viability of the hosted model.

    As Karen Conneely, Group Commercial Manager at Real Asset Management, explains, those companies that opt for a hosted Fixed Asset Register, can rapidly discover the benefits of the cloud and prove the long term viability of the hosted model for the entire software portfolio.

    Flexible Business

    Demand is growing globally for hosted solutions, as organisations wrestle with continuing financial constraints that are now seriously hampering ongoing business development. With many organisations continuing to pare back internal IT resources, business managers are having to wait months to get access to critical IT skills. They are facing a near complete lock down on the capital expenditure required to deploy much needed new software solutions; indeed they are even struggling to ensure compliance-critical upgrades of financial software are completed on time.

    In contrast, the option to leverage a highly secure, hosted third party solution that can be delivered within days, rather than months, is compelling. Add in the appeal of monthly or annual subscription rather than the huge upfront cost of a perpetual licence, and the hosted model has clear financial and business value.

    So why is it that the IT department rather than the FD is driving the move to the cloud? Is this reticence based on educated mistrust and a justifiable decision not to hand over responsibility for critical financial data to a third party, or on a simple fear of the unknown?

    Buying Decision

    The decision on whether or not to exploit the benefits that the hosted model can offer is business critical. And while FDs will be understandably reluctant to trial this approach with the core ERP or financials software, why not dip a toe in the water with other key applications, such as fixed asset management? Indeed, a hosted fixed asset register offers a raft of additional benefits. Automated upgrades ensure the software is always up to date – a key consideration for compliance, particularly in relation to the latest IFRS and SORP regulations affecting UK organisations – while the hosted model also ensures the upgrade process can be completed without any disruption or dent in user productivity.

    Furthermore, a hosted model provides access to the system from any location, allowing the FD – or other members of the finance team – to run reports, analyse asset information and check depreciation at any time, further boosting productivity.
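
    For readers less familiar with what “checking depreciation” involves, the sketch below shows the simplest case, a straight-line depreciation schedule. The asset figures are invented for illustration, and a real fixed asset register would support several depreciation methods and the relevant accounting rules.

        # Straight-line depreciation: an equal charge each year over the asset's useful life.
        def straight_line_schedule(cost: float, residual_value: float, life_years: int):
            annual_charge = (cost - residual_value) / life_years
            book_value = cost
            for year in range(1, life_years + 1):
                book_value -= annual_charge
                yield year, round(annual_charge, 2), round(book_value, 2)

        # Example asset: £12,000 item of plant, £2,000 residual value, 5-year life.
        for year, charge, net_book_value in straight_line_schedule(12_000, 2_000, 5):
            print(f"Year {year}: charge £{charge:,.2f}, net book value £{net_book_value:,.2f}")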

    With the option to get the new solution up and running within just five working days, organisations can rapidly meet corporate demands for improved fixed asset management, including the adoption of mobile asset recording via PDAs. With minimal up-front expenditure, the improvement in asset accuracy combined with the reduction in manual overhead delivers a very quick return on investment (ROI).

    Measured Decision

    Of course, before any such buying decision can be made, FDs need to understand exactly what is on offer. Security is obviously key – no organisation wants to expose its list of fixed assets to the world at large.

    The options are clear: does the business want to opt for a dedicated hosted server, or the slightly less secure, and less expensive, cloud-based solution where resources are shared on a virtual machine? Either way, the servers should be located in a highly secure, multi-million-pound data centre facility that offers security far higher than that of in-house systems.

    Organisations also need to consider how the business will access the system. One option is a dedicated Virtual Private Network (VPN), which would further reinforce the security level, but organisations must ensure the communications bandwidth will deliver the required performance.

    Conclusion

    Whatever route a business decides to explore, there is growing pressure on FDs to at least try out the hosted model. Yes, a large proportion of risk averse FDs may well be unwilling to opt for a major financials implementation as a first venture into the hosted model. But even if the global financial situation radically improves and IT suddenly receives a massive input of resources, the finance team should be at least considering the financial and speed to delivery benefits on offer. It is time to test the waters of cloud computing.
