Industry news

  • 15 Mar 2017 12:00 AM | Anonymous

    The Global Sourcing Association (GSA) has unveiled the shortlist for the GSA Professional Awards, to be held in Manchester this May. The GSA Professional Awards celebrate the teams and individuals who demonstrate exceptional sourcing practice, epitomising the value that the sourcing industry adds to businesses around the globe. Massive congratulations to everyone who has been nominated. Click here to view the shortlist.

    The GSA Professional Awards are the finale to the GSA Northern Conference, an event looking at the key role that organisations in the North of the UK play in the wider sourcing industry.

  • 14 Mar 2017 12:00 AM | Anonymous

    Manufacturing in Sri Lanka is growing at pace, according to the Central Bank of Sri Lanka, whose latest purchasing managers’ index (PMI) points to good growth prospects in the sector. With a score of 56.1 in January 2017, rising demand and stronger sales kept the index in positive territory; any score above 50.0 indicates growth. However, fears are growing that the industry will face a labour shortage as a skills gap develops. It is urgent that the gap is addressed, as Sri Lanka is hoping to win approval for its GSP+ trade deal with the European Single Market in May, potentially opening new trade routes for the economy. To read more click here.

    To read our review on Sri Lanka as a sourcing destination, click here.

  • 14 Mar 2017 12:00 AM | Anonymous

    It has been estimated that as many as 91% of the ‘facts’ from Donald Trump’s election campaign are untrue. The scale of this phenomenon means that denying false information has ceased to be effective, because such corrections drown in a sea of memes, tweets, catchy titles and brainless posts. No wonder Oxford Dictionaries declared post-truth the word of the year for 2016. In a broader sense - not just in political terms - this can be attributed to the phenomenon of data-pollution. Just as Polish cities are suffocating in smog, virtual reality is suffocating from too much information. Experts from Qlik say that this phenomenon is so severe that it will come to define technological trends, and business, in the coming years. Certainly this is the case in the field of Business Intelligence. 2017 will mark the beginning of the fight against data illiteracy - that is, the spread of the skills of reading, analysing, verifying and selecting data. Other trends for 2017 are Big Insights, business intelligence based on context, and the increasing use of data analysis tools by employees at all levels.

    Data-pollution

    It is estimated that by 2018, 80% of stored data will be completely useless, with neither the means nor any point in processing it. This is directly related to the abovementioned phenomenon of data-pollution. Infrastructure for data storage is cheap and widely available, so companies are producing an ever-increasing number of bytes - unfortunately, of questionable value at best. The collection of such data is often art for art's sake, without purpose or strategy, just a vague idea that it may prove useful sometime down the track. The result is that even information which is important to a company often dies in the black hole of the database. Such a situation does nothing to facilitate the wider use of the Internet of Things (IoT). Like every great idea originally meant to serve the good of humanity (economical and ecological houses and cities, the comfort and convenience of senior citizens and people with disabilities, etc.), the Internet of Things is becoming a caricature of itself. The Internet can be connected to absolutely everything, from the kettle to the cat’s litter tray, collecting terabytes of completely useless data. Wired magazine notes that the ironic term ‘the Internet of Shit’ is ever more popular - which basically signals the imminent death of the idea, at least in its present-day, gadget-like form.

    Big Insights and data visualization based on context

    Everything points to the coming years marking the end of the Big Data fetish and the beginning of Big Insights - a critical approach to the data being processed. And there will be more and more of this data, and it will be more nuanced. Augmented reality and the IoT will bring about the contextualization of data in the real world, enabling the capture of specific events (our actions, decisions, and behavior) in a particular place and time. This will further blur the boundary between the physical and virtual worlds - the game Pokemon Go is just one example. It also means that business analytics will need to cross that boundary too.

    Data analysis must be based on an ever wider context; otherwise, the company runs the risk of operating in a virtual bubble. A similar phenomenon is now being observed by social networking researchers, who have noticed that users operate in an environment of like-minded friends, with access to selected information served to them depending on the choices they make (the number of likes) and calculated by preference algorithms. This is the so-called filter bubble. The image of reality which thus arises is of course false and distorted, and it is harmful in many respects, because it means that our choices influence the shape and content of the information presented to us. For business this situation is equally dangerous: a company functioning in a business reality created by the paradigm of its own data is on the direct route to being isolated from the expectations of customers and the situation on the market - and, of course, to financial disaster. Worse, such a company is typically completely unaware of the danger, because it is, after all, using the most modern IT solutions. The conclusion is obvious: it is not enough to analyze your own data - it is ever more important to confront it with external data and take that into account in the decision-making process. Even if - and perhaps especially when - such data makes us uncomfortable and disturbs our comfortable perspective.

    The democratization of data analysis

    On the one hand, we must decide what data to collect; on the other, we must learn to read it. In companies this will mean the dissemination of business intelligence tools. But what does this mean exactly? Access to advanced analytical tools can no longer be reserved exclusively for top-level executives. It must also be granted to all employees, who can carry out their tasks more effectively thanks to the use of data. Not only that - analytical initiatives (i.e. how and what is to be analyzed) must be bottom-up, because every employee knows their own area of operation best and knows which data is most useful. Employees also add their own input, a unique perspective, which significantly reduces the risk of enclosing decision-makers in a virtual "filter bubble" that distorts the image of reality. Companies must therefore develop a new, complex ecosystem of Data - People - Ideas. The IT department must be at the center of this, and must be equal to the task in terms of providing relevant data and the mechanisms for processing it.

    This is obviously a much more complicated task than simply implementing the appropriate Business Intelligence tools - entrepreneurs must in fact change the way their businesses operate, focusing on educating and training employees in acquiring, analyzing and using data on the job. Data analysis will soon cease to be a narrow specialization for IT people only and become a key competence of every employee, regardless of their position - on a par with language skills and the ability to work in a group on one’s CV, without which employment in a modern company is practically impossible.

    Author: JCommerce

    Article from NearshoreIT - Blog

  • 14 Mar 2017 12:00 AM | Anonymous

    Sometimes there comes a date which must be circled on the calendar. It looks different for different people: for a programmer it can be "spaghetti code" to which a new feature must be added; for an IT Director it will be the need to implement a new tool and integrate it with a complex ecosystem built from years of patchwork IT infrastructure. What connects these situations - a specific phenomenon which often surfaces at the last minute - may be something we are not yet aware of: the phenomenon of Technological debt. This debt, even subconsciously, gives us a sense of comfort, as it allows us to indefinitely postpone hard decisions, effort, or even admitting a mistake. And this is life on credit, which Billy Gibbons warned of when he sang "It's too easy, it's too easy to feel good" - lyrics which in this case come from a bitter song about the consequences of postponing things until a later date.

    Get to know your enemy

    Technological debt means the additional work that must be done to accomplish a task as a result of past neglect within the project. The phenomenon often occurs in IT projects: negligence in the work already done creates a debt of time which must be repaid to bring the project up to the expected state. In the earlier example of the IT Director facing the implementation of new tools, paying off the debt means that integration with obsolete and poorly-documented infrastructure, which may have been in use for years, is likely to be a long and painful process. In addition to the implementation itself, a lot of work will have to be undertaken to adapt the entire system and repair unsolved problems which have built up over the years.

    The term debt is also used to refer to problems that arise in sloppily written code. Personally, I am inclined to say that the problems themselves are not Technological debt, but rather interest on the debt incurred. We incur debt through any activity that contributes to the delivery of code which is not quite up to expectations - not only in terms of the tasks to be undertaken, but also performance, readability, maintainability, and how it can be developed in the future. There may be many reasons for this state of affairs.
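    A minimal sketch of how this "interest" accrues (the function and data format are hypothetical, invented purely for illustration): a quick copy-paste fix duplicates parsing logic, so every future change to the format must be made, and re-tested, in two places.

```python
def parse_price(text):
    """Original helper: parses a string like '12.50 GBP' into a float."""
    amount, _currency = text.split()
    return float(amount)

# Quick hack under deadline pressure: copy-paste with a tweak instead of
# extending parse_price. The debt: two functions now encode the same format.
def parse_discounted_price(text, discount):
    amount, _currency = text.split()   # duplicated parsing logic
    return float(amount) * (1 - discount)

# Paying down the debt: one function, one place to change the format later.
def parse_price_refactored(text, discount=0.0):
    amount, _currency = text.split()
    return float(amount) * (1 - discount)
```

    The interest is paid each time the format changes: with the duplicated version, a fix applied to only one copy is exactly the kind of recurring, hard-to-trace error described above.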

    The sources of Technological debt

    To face up to a challenge which puts us in Technological debt, we must take a look at the causes, among which the following phenomena can most often be found:

    • lack of employee involvement in the tasks to be undertaken;

    • insufficient coverage of functionality in terms of testing;

    • lack of automatic testing;

    • outdated tools / technologies used in the project;

    • lack of experience of the team;

    • time pressure;

    • lack of documentation / low quality of documentation.

    This list is not complete, of course, and some aspects may be closely linked to each other. Programmers can write low-quality code for many reasons: a lack of motivation, knowledge or experience, time pressure, or a lack of proper tools. The programmer is often only an indirect cause of the debt, as its management is the responsibility of the project manager, who makes decisions regarding time, tools, technology, and how they are allocated - often without outside input. Sometimes it is difficult to identify the causes of the debt. Small errors in a project, or strategic decisions such as the need to provide a solution in a very short time, can lead to the creation of such debt. The origins of the phenomenon vary from project to project and can be very complicated, but regardless of its origin, Technological debt must be managed effectively.

    Debt’s not always that bad

    It would be a mistake to unambiguously define Technological debt as something that should always be avoided at all costs. Just as a loan may allow the company to spread its wings, Technological debt incurred reasonably can be helpful in many cases.

    Technological debt may be divided into three types, which clearly show that not all debts are created equal:

    1) Naive debt – resulting from negligence, bad practices, and immaturity in business.

    2) Unavoidable debt – debt which we are not able to predict. Good decisions taken today may be a cause of debt in the future.

    3) Strategic Technological debt – debt which is incurred consciously, when the benefits are greater than the consequences.

    It may turn out that incurring debt brings very tangible benefits, especially when the project hangs in the balance and delivering a sufficient batch of functionality becomes a matter of "to be or not to be" for the project. The decision to incur such debt could even save the whole project (and sometimes the entire company). One should, however, take extreme care when making this type of decision, because poor management of the resulting debt can lead to disaster.

    Am I in debt?

    If we are aware of our debts, we are able to pay them off regularly - the debt itself is not a problem if it doesn’t harm ongoing operations. Ignorance in this case is absolutely not bliss, because we neither anticipate the impact nor know when it will occur. Fortunately, in IT projects the symptoms that indicate the existence of Technological debt appear quite quickly and are visible to the naked eye.

    Here are a few diagnostic questions:

    • Does the programming solution work slower and slower?

    • Is there partial or total downtime in the operation of the system?

    • Are the same errors repeating themselves?

    • Is the time taken to implement new solutions constantly increasing?

    • Does the application work slowly?

    • Are your programmers reluctant to work on the project?

    • Did you push your team to implement new functionalities quicker than planned?

    • Are there instances of errors which are difficult to recreate or solve?

    Even if the answer to all questions is no, you should not feel overly safe. Some experts are of the opinion that Technological debt is a permanent element of the possession or development of software and IT infrastructure.

    So let’s assume that Technological debt occurs in every IT project in some form. This means that the effective management of the debt is very important. In other words, it is not enough to merely control and neutralize the effects – much more important is a methodical approach to the quality of the application, and preventing the occurrence of debt where we don’t want to incur it.

    How to manage debt?

    You need to manage debt and fight it on all fronts. The following are of crucial importance:

    1) Building awareness of the importance of quality within the company. Quality itself must be seen as a value which we should care about. Without this, it’s difficult to manage debt, because we can’t do much without the proper approach from employees. Their involvement is key.

    2) Controlling processes. Constant feedback about what’s going on in the project helps us to react quickly when issues arise.

    3) Quality assurance. Caring about the quality of software, getting it checked by specialists, not just in terms of testing, but including all aspects of quality.

    4) Tests. The quicker testing begins, the sooner problems can be found. Tests at the level of documentation and unit testing eliminate debt very quickly.

    5) The application of best practices, such as:

    a. adherence to rules for naming functions, procedures, etc.;

    b. the application of coding style, involving the introduction of appropriate indentation;

    c. the creation of technical documentation;

    d. the prevention of basic mistakes, e.g. array overflow, problems with the initialization of variables, etc.;

    e. management of versions/backups;

    f. refactoring - improving the code to obtain better readability, easier maintenance, and easier future development;

    g. algorithmics: simplifying functions;

    h. pair programming – to create better quality code;

    i. code review: reflecting on those solutions used, and the removal of visible errors.

    6) Training employees.

    7) Promotion of unit testing as a basic tool, with TDD as a standard approach for programmers creating software.

    Just as before, unfortunately, this list is incomplete, as debt can have very different sources, often specific to a particular company or even a project. A company should therefore develop its own debt-management technique - and most importantly, it should not ignore debt until it's too late, until disaster strikes and the bailiffs knock on the door.
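    Point 7 above, test-first development, can be sketched in a few lines (the function and its contract are hypothetical examples, not from any real project): the test is written first and pins down the expected behaviour, so regressions surface immediately instead of accruing silently as debt.

```python
def validate_estimate(hours):
    """Accept a positive whole number of hours; reject anything else."""
    if not isinstance(hours, int) or isinstance(hours, bool) or hours <= 0:
        raise ValueError("estimate must be a positive whole number of hours")
    return hours

def test_validate_estimate():
    # In TDD style these assertions are written first and define the contract;
    # the implementation above only has to make them pass.
    assert validate_estimate(8) == 8
    for bad in (0, -3, 2.5):
        try:
            validate_estimate(bad)
        except ValueError:
            continue
        raise AssertionError(f"{bad!r} should have been rejected")

test_validate_estimate()
```

    The same checks would normally live in a test runner such as unittest or pytest, so they run on every build - which is exactly the early feedback loop that keeps interest on the debt from compounding.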

    Author: Leszek Zieliński, JCommerce

    Software Quality Assurance Engineer.

    Since 2011, he has implemented and managed QA projects. An expert in the field of manual and automated testing. Fascinated mostly by BDD solutions. A graduate of the Silesian University of Technology. After hours he’s a social activist, musician, and triathlete.

    Originally published on NearshoreIT-Blog

  • 13 Mar 2017 12:00 AM | Anonymous

    For automotive brands, customer experience is easy. It’s all about the swish showroom, smooth test drive, and slick salesman spiel – right?

    Wrong. The automotive industry is changing, and with it, the expectation for brands to deliver an end-to-end customer journey. A journey that takes the consumer from browsing on mobile, through the showroom, into a test drive and ending with purchase.

    Take Tesla, who are redefining how we think about the buying process, with their streamlined straight-to-buy site and customised design studio. Or Honda, whose VR driving experience unveiled at this year’s CES uses geotagging and interactive content to immerse the user in a virtual test drive.

    But as more and more people expect to browse, explore, and purchase all via digital, there are even more points at which brands can lose their customers. And it’s becoming a huge issue – exactly as we found at our Excellence in Customer Experience event held in February in conjunction with the Global Sourcing Association.

    The event covered everything from how to measure and improve CX to a panel discussion on the new technologies driving customer experience. But how did we get our audience to really get to grips with how customer experience affects purchase?

    We set up a workshop to assess the mobile web CX of ten major automotive brands. Users had to complete three tasks on their mobile devices, which were scored using a featherweight model of our CX Score. Each group had to find the garage fit, NCAP safety rating, and dealership contact details for a specific model available nearby. And the results were startling.

    Own your space

    Let’s start with the positives. Out of 10 major British and International automotive brands, Seat came in at top place. Why’s that? We noticed that the brands who ended up with the best scores were those who had the best integrated digital experience. That means that our group used external sites to help them find the information they needed – and it worked.

    This seemed to be a trend. Groups who used aggregators which directed them back to the brand’s site ranked their overall customer experience higher. Users researching the Toyota Yaris successfully completed the tasks, but poor search optimisation in aggregators meant they were rarely directed to the most helpful site, straight away affecting their overall score.

    If users can quickly find out what they need to know, that’s one half of your customer experience sorted. But if you’re losing customers to aggregators because your site doesn’t meet requirements, that’s poor CX. Brands need to own the spaces where customers are interacting with their brands at every level. You’d think in this day and age that would be second nature, but our findings show it to be far from the case.

    Think best practice

    Moving back to the brands’ own sites, testers regularly reported poor customer experience. BMW’s navigation was deemed ‘laughable’ and landed it in the bottom three, while Audi’s site looked good but wasn’t comprehensive and failed in two out of three of the tasks.

    And yes, it may sound obvious, but sites need to be easy to navigate with a helpful (and working…) search function, an intuitive layout and a clear, fair value exchange. Ford and Peugeot were ranked low because their poor CX meant lots of scrolling, a clunky search function, and users ending up on different tabs and PDFs which interrupted their journey.

    Once brands have this best practice in hand, they can focus on making their customer experience memorable. While users found Fiat’s site ok to navigate, as one user put it ‘if you didn’t want a Fiat 500 before you went to the website, you wouldn’t want one after’. Inspiring content is an essential element in delivering excellent customer experience.

    Join the dots

    While the main site may be the main event, auto-manufacturers need to bear in mind their entire brand ecosystem. We found users had to navigate a number of individual branded microsites – and that’s where they found inconsistencies.

    Honda’s dealer websites may have all the necessary information, but they weren’t similar enough to make them easy to compare. Similarly for VW, the main site had positive feedback but the details on the dealership sites were poor.

    This disconnect can be hugely detrimental, even if a company’s main site is strong. Brands need standardised control over microsites to ensure excellent customer experience in all user interactions.

    So where did that leave the overall standings?

    1. Seat Leon

    Came well ahead in first place for its performance on Google and other aggregators.

    2. Kia Picanto

    Site had good content but users found it difficult to find exactly what they were looking for, and wouldn’t go back.

    3. Honda Civic

    Good main site but overall experience let down by poor dealership sites.

    4. Fiat 500

    Website was easy to use but it was nothing special and generally uninspiring.

    5. VW Polo

    Microsites weren’t standardised enough, creating inconsistencies and making it difficult for the user to compare information.

    6. Toyota Yaris

    Dealership sites were ok but the overall experience wasn’t exciting enough to get users wanting to buy.

    7. Audi A1

    The site looked good but wasn’t comprehensive: users couldn’t find the information they were looking for.

    8. BMW Mini

    Poor overall CX, with searching throwing up 404s and a poor value exchange when users were asked to input information too early on in the journey.

    9. Peugeot 208

    Lots of disconnect between pages, with lots of scrolling needed to find information, and a major omission in that the website didn’t mention the 5-star NCAP rating.

    10. Ford Fiesta

    Poor search tool, with users ending up on different tabs and PDFs, making the journey difficult and necessary information too complicated to find.

    Snapshot: what we learned from automotive CX testing

    1. It’s the simple things that make the biggest difference

    When users are unable to complete simple tasks, it’s the basic functions of customer experience which need to improve first. Brands need to minimise the number of steps in the journey to purchase by simplifying layout and keeping users on the site. Once customers are able to easily find what they’re looking for, they’ll have time for the more emotive content. And that’s what will get them coming back – and even better, buying.

    2. External aggregating sites are owning the customer experience

    While overall digital experience is key in delivering best practice CX, brands are losing out to aggregators who are owning good navigation. Whilst search optimisation is one half of the battle, the other is improving your site’s search functions and internal navigation so users will trust your brand to find them what they want, before they look elsewhere.

    3. There’s a disconnect in quality across the digital experience

    You can’t expect customers to stay on one site throughout their journey to purchase. Which is why brands need to ensure the quality of their main site extends to microsites and beyond. Wherever there’s branded content, it needs to meet the same standards. That will help create a seamless and coherent experience which will guide your customer to purchase.

    Alastair Cole, Chief Innovation Officer, Partners Andrews Aldridge

  • 13 Mar 2017 12:00 AM | Anonymous

    Good CX saves money and makes money. A 2% increase in customer retention has the same effect as decreasing costs by 10%, while 86% of buyers will pay more for good customer experience.

    That’s why we hosted a CX Event with the Global Sourcing Association on Excellence in Customer Experience. The event focused on how to measure and improve CX, introducing revolutionary new products such as CX Score and instigating a workshop which got the audience rating mobile web CX for major automotive brands.

    The real revelations came in the panel discussion, which saw leading experts in the industry debate the new technologies driving improvements in customer experience. It covered everything from new channels for brands to interact with customers, to understanding customers’ omni-channel journey.

    And through this, the overriding message for providing good CX was clear: don’t overcomplicate what customers want as an outcome.

    Keep it simple, stupid

    The scope of customer experience is changing with the increasing impact of AI. Not only do customers expect more from their brand experience – whether that’s through innovative tech or tailored personalisation – customers are actually choosing products because of their AI.

    But with brands keen to jump on the AI bandwagon there’s a danger of overloading the consumer with information. With so many opportunities to overcomplicate the customer journey – through different channels, targeting, tagging, and interaction points – brands need to focus on simplicity.

    A shining light here is Aviva, whose simple layout and intelligent architecture make sure the user doesn’t get lost. Their ‘Shape My Future’ tool guides you to a personalised page which feels unique but doesn’t demand any superfluous information. It really doesn’t matter if brands deliver slightly less, as long as they’re doing so in a better way.

    Put down the scattergun

    Tired of being fed the same ads over and over again, even though you bought the product they’re selling three months ago? That’s poor CX – and there’s no excuse for it. The solutions for clever retargeting are out there, and brands need to use them.

    Highly specialised retargeting tools mean brands can easily avoid the ‘send to all’ approach. Instead they can map each individual to send them a personalised message, and use that first-party data to feed back into the wider ecosystem.

    It’s those brands who capture people at the right moment and keep them satisfied before delivering the content they need that will make the customer journey more seamless.

    Tech for tech’s sake

    Keeping CX simple means using AI in the right way to get a focused, personalised outcome. It also means using AI in the right places, to keep the customer journey as simple as possible.

    Unnecessary AI actually has a negative impact on CX. One panellist spoke of an instance where chatbot features only become available once you’re through to your basket. At this point in the journey your questions have probably already been answered, so it’s annoying and unhelpful that the chatbot wasn’t offered at a better point in the journey – say, on the FAQs page.

    Sounds obvious? It is. If brands are investing in innovative tech, they need to learn where and when to place it in order to get the best results.

    Robots have feelings too

    Using AI to fine tune your customer experience is risky in more ways than one. While customers are constantly searching for the quickest, most efficient brand interactions, they still crave the human factor. How often are we left frustrated by a non-responsive robot at the other end of the phone?

    The human factor makes AI empathetic, intelligent, and nuanced – which is exactly what you’re looking for when you’re interacting with customer services, for example, and getting a reply to a complaint. The ‘humanity’ in AI also makes an experience memorable and special as it conveys a brand’s personality – essential for good CX. Netflix is doing it well, with its IBM Watson ‘Papal Artificial Intelligence’ trolling Twitter users with quotes from the Bible to promote the new season of The Young Pope – and more brands need to follow suit.

    Outcome beats output

    You’d be forgiven for thinking that good CX demands a lot of contradictions: simple but intelligent, efficient but personal, multi-faceted but specialised. But it all makes sense when brands remember to put the customer outcome first.

    Keeping the customer journey simple, tailoring personalised messages, using AI only where it’s wanted and making it feel human are all ways of putting the human experience front and centre. That doesn’t just mean their journey to purchase, it also means their experience with the product once they’ve bought it.

    So how do brands show that the outcome is the most important factor in customer experience? It can sometimes take a bold move. Lloyds was one brand mentioned by the panel for their moving, stereotype-inverting campaign which puts the focus on the customer end point. We recognise that good banking has played a part in the outcomes they’ve achieved, but the focus is on the achievement itself: a same-sex marriage, a child’s first day at school, a first kiss.

    Brands who recognise what consumers want at the end of their journey will find it easier to create an experience which will get their customers where they want to be.

    By Imogen Lees, Content Editor at Partners Andrews Aldridge

  • 13 Mar 2017 12:00 AM | Anonymous

    Lloyds Banking Group plans to move about 1,900 staff to IBM in a restructuring plan aimed at reducing costs but which could see the bank's security weakened, according to a trade union. Chief Executive Officer Antonio Horta-Osório is looking to shed thousands of jobs to streamline the business, support dividend payments and boost the share price as the government prepares to sell down its remaining stake in the bank this year. You can read more here.

  • 13 Mar 2017 12:00 AM | Anonymous

    US chipmaker Intel is taking a big bet on driverless cars with a $15.3bn (£12.5bn) takeover of specialist Mobileye. Mobileye and Intel are already working together, along with German carmaker BMW, to put 40 test vehicles on the road in the second half of this year. Intel expects the driverless market to be worth as much as $70bn by 2030. Announcing the deal, Intel said that as cars "progress from assisted driving to fully autonomous, they are increasingly becoming data centres on wheels". You can read more here.

  • 10 Mar 2017 12:00 AM | Anonymous

    Hounslow Council has announced its intention to award a Corporate Services Contract to Liberata. The contract will see Liberata deliver a range of services including revenues and benefits assessments, transactional finance, HR administration, and payroll services on behalf of the Council. The initial contract is for seven years, with the option to extend it for a further three years, and is potentially worth up to £75 million. Click here to read the full story.

    The GSA is hosting a public sector sourcing event in March, click here to learn more.

  • 8 Mar 2017 12:00 AM | Anonymous

    The budget today unveiled new support for disruptive technologies and innovation in the form of £270m of funding to put the UK "at the forefront" of disruptive technology industries. The funds for the hi-tech research come from the National Productivity Investment Fund (NPIF), set up by the government last year. The sums, however, are smaller than those announced by some other countries. For example, the US Department of Transportation proposed a 10-year plan to invest $4bn (£3.3bn) in self-driving cars. It’s good to see disruptive tech and digital infrastructure getting some recognition from the government; however, actions speak louder than words, and the government faces stiff competition in the global market for tech research. Click here to learn more.
