Today’s clash of crises and technological development has formed a crucible of innovation for the manufacturing sector – but companies should not jump blindly into change for the sake of it…
Many people fear that digital technology is going to cause mass unemployment as computers take over jobs that people used to do, performing them faster, better, more safely and at a lower cost. And there is no doubt that some jobs will be destroyed by automation.
That’s nothing new. Automation has been happening for several hundred years and during that time, in the UK at least, employment rates have increased rather than decreased. Of course, technology will change the nature of employment. In the United States, farming jobs made up 40 per cent of the workforce in 1900; by 2000 that had dropped to 2 per cent. Similarly, 25 per cent of jobs in 1950 were in manufacturing; this had dropped to 10 per cent by 2010. In both cases, new jobs, sometimes in new industries, eventually offset the losses.
Technology will cause short-term challenges as old skills become redundant, and some people find it hard to reskill. But automation, as it gets rid of dull, dirty and dangerous jobs, providing people with jobs that create more value, will benefit society. The challenge for employers is ensuring that workers have the skills they need to flourish in a digital economy.
Digital literacy and skills
Digital technology underpins business, and today’s workers need to be digitally literate. This is not always the case, and there is often a need to educate people in the basics of digital technology, beyond using social media and videogames. Workers need to demonstrate digital literacy through a number of skills, including:
• Online learning, the ability to use technology to locate trustworthy resources online and to use that access to learn
• Communication and collaboration, the ability to share complex ideas and work in teams and to collaborate effectively and professionally
• Computational thinking, the ability to collect and analyse data
• Problem solving, the ability to use technology to find or generate innovative solutions that are realistic and effective commercially
These are all skills that any organisation will benefit from. But they need to be backed up by appropriate knowledge. And, with technology changing so rapidly, the knowledge that workers have must change rapidly too.
The truth is that business skills acquired yesterday will be obsolete tomorrow. That’s frightening. And it requires a sea change in the way that many organisations treat training and skills development. It’s no longer sufficient to employ someone with a particular set of skills and expect these to be relevant several decades (or even several years) later. Life-long learning is required.
And in some cases, where jobs simply disappear, workers will need to develop the mental agility to reinvent themselves. For example, a 20-year-old lorry driver today will almost certainly need to accept that their job will have ceased to exist by the time they are 40. They will then need the confidence to carve out a career using the skills they acquired while driving a lorry to become, perhaps, a drone pilot (hand-eye co-ordination), a counsellor (the ability to manage stress and loneliness), or something they simply cannot foresee at the moment. Flexibility, a willingness not to be defined by your job, and a hunger to learn new skills constantly will be essential.
Knowledge management
Organisations need digitally skilled employees. But they also need to have strong digital capabilities. An essential organisational capability is knowledge management.
Knowledge (not data as some might say) is the lifeblood of organisations. It can be defined as an understanding, gained by personal experience or tuition (someone else’s experience), of the things that are happening in and to an organisation.
Knowledge is important because if you share it you can help people understand what might happen under a particular set of circumstances. It is much more useful than mere information, which only tells you “what, when and who”. Knowledge can also tell you “why and how” and might even tell you what to do next.
Information is seductive. It is easy to store and share, and easy to manage. But it isn’t much use. Knowledge, because it includes personal experience, is far less easy to manage, but it is much more useful. Organisations therefore need to think about how they can collect and curate knowledge.
To share knowledge usefully, it must first be expressed by the person who has it, and then recorded and stored in a way that enables people to find it. People can turn their knowledge into shareable assets by talking, drawing, writing or demonstrating. Written assets are simple to store, and digital technology makes them searchable by:
• Identifying words (including names)
• Identifying sentiments that were expressed
• Taking account of context such as dates, places, the meaning of paragraphs (rather than single words) and even slang and irony
Another type of knowledge asset is the spoken word. Technology can help here by translating speech into written documents that can be searched. Technology can even provide some extra analysis by providing insight derived from body language or tone of voice of the speaker.
Videos and drawings are harder to convert into useful assets and may require human intervention to tag them with keywords and descriptions to enable them to be found.
Once the knowledge has been made explicit, it can be curated – sorted, prioritised and described so that other people can find it and use it. Again, technology can help here – for instance, with sorting and describing knowledge – although prioritising its importance effectively is likely to require humans for some time yet.
Employers need to ensure their workforce constantly develops new skills. And they need to ensure that their organisations have the capabilities to encourage the use of new skills and the technology to capture knowledge as it changes. These are significant challenges. But unless they are met, today’s organisations will struggle tomorrow.
In an edited extract from his book Digital Governance, Jeremy Swinfen Green explores the digital skills the workforce of the future will need.
The cloud has transformed business. A great majority of organisations now use it to drive digital capabilities and the cloud migration market has been growing at around 30 per cent a year. The Covid-19 pandemic, which has caused major problems for the management of on-premises IT systems, is accelerating this trend.
However, for many organisations, cloud migration is still a work in progress. And this is particularly true of large organisations with business-critical legacy systems. All too often there is inadequate documentation, and the specialist developers who originally implemented the code may well be long gone from the organisation. Migrating these robust, time-proven systems to a new environment can pose a major risk to operational effectiveness.
In addition, simply migrating old applications to the cloud as they stand is likely to be ineffective and risks undermining the advantages of cloud economics. There are many technical challenges in moving legacy applications to the cloud, such as refactoring them to adopt modern capabilities like APIs and microservices.
And yet the pressures to migrate to the cloud, including lower costs, security, and the need for operational flexibility, are still as strong as ever.
Organisations faced with the need to migrate legacy systems have three approaches that they can take: rehosting, rewriting or refactoring.
The quickest and easiest approach is to rehost all business operations in the cloud, simply taking applications and their associated data from local servers and placing them on cloud servers. This approach, often known as “lift and shift”, has its benefits. However, organisations cannot simply lift and shift all of their applications, as doing so would make suboptimal use of cloud economics for complex applications. It would also mean that some cloud-native benefits, such as continuous real-time deployment, may not be possible.
This is because in a lift and shift context, the underlying code of the applications being migrated is not examined or altered in any way during rehosting. It runs as it did before, simply in a different environment. And in some cases, it runs sub-optimally.
An alternative approach is to rewrite business applications in a cloud environment, potentially re-architecting them in a cloud-native architecture. And, if the developers have done their jobs well, the application will be perfectly adapted to the new cloud environment. But, with potentially millions of lines of code involved, a rewriting project comes at considerable cost, and can take many months.
The third approach, and one that is increasingly popular, is to refactor the code of the applications. This involves restructuring the code so that it is optimised for the cloud. Often when refactoring the code, engineers rearchitect the application and try to adopt modern frameworks and concepts such as APIs and macro/micro-service architecture that, once deployed on cloud, significantly increase resilience and improve scalability, flexibility and elasticity. All of this means that, while it is more efficient than building code from the ground up, refactoring can still be a slow process with high implementation costs because it requires substantial human intervention.
One way to make refactoring more efficient is to automate it. For instance, code assessment, a process that must be undertaken at the start of any software migration, can be finished in hours when it is automated, rather than taking days and weeks as it would with a team of human developers.
Of course, automation can only go so far. Any code will need detailed configuration, and tweaks to address any environment dependencies such as checking and changing URLs. And once code has been refactored, the application will need to be tested and validated by a human. However, it is realistic to assume that three quarters of the refactoring work can be automated, representing a major saving of time and money.
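As a toy illustration only (the file layout, language choice and function names below are assumptions, and this is not a description of any particular vendor’s tooling), one small slice of automated code assessment might look like the following Python sketch, which scans a source tree for hard-coded URLs that would need re-pointing or externalising into configuration before migration:

```python
# Hypothetical sketch: flag hard-coded URLs in a legacy source tree so that a
# human (or a later automated step) can re-point or externalise them.
import re
from pathlib import Path

URL_PATTERN = re.compile(r"https?://[^\s\"']+")

def find_hardcoded_urls(source_root):
    """Return (file, line number, url) tuples for every URL literal found."""
    findings = []
    for path in Path(source_root).rglob("*.java"):  # assuming a Java estate for the example
        text = path.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), start=1):
            for url in URL_PATTERN.findall(line):
                findings.append((str(path), lineno, url))
    return findings

if __name__ == "__main__":
    for file, lineno, url in find_hardcoded_urls("./legacy-app"):
        print(f"{file}:{lineno}  hard-coded URL: {url}")
```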
One company pushing the boundaries of automated cloud refactoring is Hexaware. Its Amaze™ re-platforming service provides a number of key benefits over other migration methods. These include lower total cost of application ownership and lowered licensing costs. Other benefits are faster cloud migration at lower cost, and faster lifecycles for applications once they are in the cloud, such as real-time updates.
Hexaware claims it can deliver considerable benefits including a 50 per cent reduction in the total cost of transforming and refactoring an application, a 30 per cent increase in productivity and a 40 per cent reduction in time to market.
Successful refactoring is down to effective due diligence and analysis of the existing application architecture, complexity, dependencies, and integrations. The Amaze™ platform starts with this step. It conducts a thorough automated analysis of existing application architecture and creates a cloud-readiness report which identifies the changes required to make the application cloud ready: Amaze™ can pinpoint the exact line of code which needs to be changed.
Amaze™’s automation-based discovery tool plays a major part in mapping the existing technology stack and identifying any cloud-based requirements, thereby ensuring a comprehensive, rapid and lower-cost solution, when compared with a traditional approach.
Implementation comes next, when the code changes are made. Again, there is an opportunity to reduce costs and increase speed by automating code generation. In addition, standard practices such as adopting open-source software can be used to reduce licensing costs. For instance, PostgreSQL can be used instead of an Oracle database, while WebLogic app servers can be migrated to cost-effective open-source options.
Refactoring also involves modernising the application. This involves breaking a monolithic application into smaller API-led macro/micro services, which Amaze™ does automatically. The newer architecture allows applications to expose capabilities over APIs, and scale independent services based on a containerised deployment. This helps the customer realise all the major benefits and flexibility that the cloud has to offer.
Testing the newly refactored application with production data also needs to happen before the newly “cloudified” application is deployed into the operational environment.
At the organisational level, it is likely that a pilot implementation would be run, with migration tested on one or two applications. Next, a fuller implementation involving perhaps a single business unit or function might be run; a limited number of applications would be migrated but at this stage key activities such as agreeing metrics and benchmarks, and agreeing guidelines and best practice, can be established. With these in place, the third stage can be to scale up the migration across the organisation. Running cloud migration this way may take a little longer, but risks are mitigated and a better outcome is likely as early lessons can be incorporated in the final architecture.
The cloud is here to stay, and pressures for migration are only increasing. Given these, it makes sense to use emerging technologies such as automation and machine learning to achieve successful migration of legacy applications at lower cost and higher speed.
For more information, please visit the Hexaware website.
How new automation technology could help businesses navigate the thorny path towards cloud migration.
Alexandra Louise Uitdenbogerd, RMIT University
You might have seen a recent article from The Guardian written by “a robot”. Here’s a sample:
“I know that my brain is not a “feeling brain”. But it is capable of making rational, logical decisions. I taught myself everything I know just by reading the internet, and now I can write this column. My brain is boiling with ideas!”
Read the whole thing and you may be astonished at how coherent and stylistically consistent it is. The software used to produce it is called a “generative model”, and they have come a long way in the past year or two.
But exactly how was the article created? And is it really true that software “wrote this entire article”?
How machines learn to write
The text was generated using the latest neural network model for language, called GPT-3, released by the American artificial intelligence research company OpenAI. (GPT stands for Generative Pre-trained Transformer.)
OpenAI’s previous model, GPT-2, made waves last year. It produced a fairly plausible article about the discovery of a herd of unicorns, and the researchers initially withheld the release of the underlying code for fear it would be abused.
But let’s step back and look at what text generation software actually does.
Machine learning approaches fall into three main categories: heuristic models, statistical models, and models inspired by biology (such as neural networks and evolutionary algorithms).
Heuristic approaches are based on “rules of thumb”. For example, we learn rules about how to conjugate verbs: I run, you run, he runs, and so on. These approaches aren’t used much nowadays because they are inflexible.
Read more: From Twitterbots to VR: 10 of the best examples of digital literature
Writing by numbers
Statistical approaches were the state of the art for language-related tasks for many years. At the most basic level, they involve counting words and guessing what comes next.
As a simple exercise, you could generate text by randomly selecting words based on how often they normally occur. About 7% of your words would be “the” – it’s the most common word in English. But if you did it without considering context, you might get nonsense like “the the is night aware”.
More sophisticated approaches use “bigrams”, which are pairs of consecutive words, and “trigrams”, which are three-word sequences. This allows a bit of context and lets the current piece of text inform the next. For example, if you have the words “out of”, the next guessed word might be “time”.
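To make the idea concrete, here is a minimal Python sketch of bigram-based next-word prediction over a deliberately tiny toy corpus (a real system would be trained on vastly more text, and would use trigrams or longer contexts):

```python
# Minimal bigram model: count which word follows which, then predict the most
# frequent follower. The corpus here is invented purely for illustration.
from collections import Counter, defaultdict

corpus = "we are out of time and out of money and out of time again".split()

bigrams = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    bigrams[current_word][next_word] += 1

def predict_next(word):
    """Guess the most frequent follower of `word` seen in the corpus."""
    followers = bigrams.get(word)
    return followers.most_common(1)[0][0] if followers else "<unknown>"

print(predict_next("of"))  # -> "time" (it followed "of" twice, "money" only once)
```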
This happens with the auto-complete and auto-suggest features when we write text messages or emails. Based on what we have just typed, what we tend to type and a pre-trained background model, the system predicts what’s next.
While bigram- and trigram-based statistical models can produce good results in simple situations, the best recent models go to another level of sophistication: deep learning neural networks.
Imitating the brain
Neural networks work a bit like tiny brains made of several layers of virtual neurons.
A neuron receives some input and may or may not “fire” (produce an output) based on that input. The output feeds into neurons in the next layer, cascading through the network.
The first artificial neuron was proposed in 1943 by US neuroscientists Warren McCulloch and Walter Pitts, but neural networks have only become useful for complex problems like generating text in the past five years.
To use neural networks for text, you put words into a kind of numbered index. You can use the number to represent a word, so for example 23,342 might represent “time”.
Neural networks do a series of calculations to go from sequences of numbers at the input layer, through the interconnected “hidden layers” inside, to the output layer. The output might be numbers representing the odds for each word in the index to be the next word of the text.
In our “out of” example, the number 23,342 representing “time” would probably have much better odds than the number representing “do”.
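A very rough, untrained sketch of that arrangement might look like the following, with words mapped to index numbers, a small hidden layer, and an output layer that assigns odds to every word in a toy vocabulary (every size and value here is an arbitrary illustration, not GPT-3’s actual design):

```python
# Untrained toy network: context words in, a probability for every vocabulary
# word out. Training would adjust the weight matrices so that "time" scores
# highly after the context ["out", "of"].
import numpy as np

vocab = ["out", "of", "time", "do", "the"]           # toy vocabulary
word_to_index = {word: i for i, word in enumerate(vocab)}

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 8))        # one 8-number vector per word
hidden_weights = rng.normal(size=(16, 16))           # hidden layer (2 words x 8 numbers in)
output_weights = rng.normal(size=(16, len(vocab)))   # output layer: one score per word

def next_word_odds(context):
    x = np.concatenate([embeddings[word_to_index[w]] for w in context])
    hidden = np.tanh(x @ hidden_weights)              # pass through the hidden layer
    scores = hidden @ output_weights
    probs = np.exp(scores) / np.exp(scores).sum()     # softmax: scores become odds
    return dict(zip(vocab, probs.round(3)))

print(next_word_odds(["out", "of"]))                  # meaningless until trained
```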
Read more: Friday essay: a real life experiment illuminates the future of books and reading
What’s so special about GPT-3?
GPT-3 is the latest and best of the text modelling systems, and it’s huge. The authors say it has 175 billion parameters, which makes it at least ten times larger than the previous biggest model. The neural network has 96 layers and, instead of mere trigrams, it keeps track of sequences of 2,048 words.
The most expensive and time-consuming part of making a model like this is training it – updating the weights on the connections between neurons and layers. Training GPT-3 would have used about 262 megawatt-hours of energy, or enough to run my house for 35 years.
GPT-3 can be applied to multiple tasks such as machine translation, auto-completion, answering general questions, and writing articles. While people could often tell that earlier models’ articles were not written by human authors, with GPT-3 we are now likely to get it right only about half the time.
The robot writer
But back to how the article in The Guardian was created. GPT-3 needs a prompt of some kind to start it off. The Guardian’s staff gave the model instructions and some opening sentences.
This was done eight times, generating eight different articles. The Guardian’s editors then combined pieces from the eight generated articles, and “cut lines and paragraphs, and rearranged the order of them in some places”, saying “editing GPT-3’s op-ed was no different to editing a human op-ed”.
This sounds about right to me, based on my own experience with text-generating software. Earlier this year, my colleagues and I used GPT-2 to write the lyrics for a song we entered in the AI Song Contest, a kind of artificial intelligence Eurovision.
We fine-tuned the GPT-2 model using lyrics from Eurovision songs, provided it with seed words and phrases, then selected the final lyrics from the generated output.
For example, we gave Euro-GPT-2 the seed word “flying”, and then chose the output “flying from this world that has gone apart”, but not “flying like a trumpet”. By automatically matching the lyrics to generated melodies, generating synth sounds based on koala noises, and applying some great, very human, production work, we got a good result: our song, Beautiful the World, was voted the winner of the contest.
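For readers curious about the mechanics, the snippet below shows roughly how a GPT-2 model can be prompted with a seed word and asked for several candidate continuations, using the open-source Hugging Face transformers library. It is a simplified stand-in for illustration, not our actual Euro-GPT-2 pipeline (which was fine-tuned on Eurovision lyrics first):

```python
# Prompt an off-the-shelf GPT-2 with a seed word and generate three candidate
# continuations; a human then curates which (if any) to keep.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the sampling repeatable

candidates = generator("flying", max_length=12, num_return_sequences=3, do_sample=True)
for i, candidate in enumerate(candidates, start=1):
    print(f"{i}. {candidate['generated_text']}")
```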
Co-creativity: humans and AI together
So can we really say an AI is an author? Is it the AI, the developers, the users or a combination?
A useful idea for thinking about this is “co-creativity”. This means using generative tools to spark new ideas, or to generate some components for our creative work.
Where an AI creates complete works, such as a complete article, the human becomes the curator or editor. We roll our very sophisticated dice until we get a result we’re happy with.
Read more: Computing gives an artist new tools to be creative
Alexandra Louise Uitdenbogerd, Senior Lecturer in Computer Science, RMIT University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Automation is taking over human roles at a steady clip. But are the latest “writing machines” really all they’re made out to be?
The coronavirus pandemic has deeply impacted people in communities across the globe, sadly with significant loss of life. As the number of those recovering from the virus continues to grow, we also recognise that our world has been changed forever.
Getting basic provisions such as foods and medicines has been a major challenge for many people during the crisis, with many retailers struggling to keep their shelves stocked and customers seemingly panic buying and stockpiling products, from paracetamol to toilet rolls.
Supporting customers and retailers throughout have been multiple groups of wholesalers, distributors, manufacturers and suppliers collaborating as players within global supply chains.
Supply chains are fascinating, dynamic and exciting – highly sophisticated, multi-layered, interconnected and interrelated distribution systems which enable companies and countries to balance supply and demand and trade more efficiently. Globally, supply chains have led a relatively settled existence for the last 50 years or so. All that is about to change, not least as a result of the crisis.
The pandemic closed many factories across the globe, as companies acted swiftly to protect the health and wellbeing of their workers and respond to rapid reductions in their inventory and massive disruption to their supply lines and logistics networks across international borders.
Some were able to maintain, and in some cases increase, production – notably some food manufacturers. Others were able to switch some of their production lines to satisfy emergency needs, such as the manufacture of life-saving products and components for ventilating machines, and personal protection equipment (PPE) for health and social care workers.
One excellent example of this was the instant supply chain created by Ventilator Challenge UK, a consortium of 21 manufacturing engineering and seven Formula 1 racing firms, led by the UK government-backed High Value Manufacturing Catapult, delivering 10,000 ventilating machines to the NHS.
Recovery from the crisis will not be instantaneous for supply chains, as individual players and businesses within the chain emerge from what may well be a sustained period of inactivity. Retaining staff and skills will be vital.
Something which will be of great interest to the government once the crisis is over is the propensity of UK firms and overseas investors to “right-shore” (also referred to as “onshoring” or “reshoring”), bringing back to the UK manufacturing operations that were previously located abroad. Bringing supply chains geographically closer could be a significant step in the race to rebuild resilience, reduce carbon footprints and potentially increase revenues for the Treasury.
Back on the shop floor, however, what every proprietor and manager will want to know is what we have learned from all of this. Whether you are an original equipment manufacturer (OEM), or tier 1 or lower supplier to that OEM, you may well be wondering how you engineer greater resilience, sustainability and value for your business in the future.
Industry 4.0 (alternatively described as digital manufacturing or Supply Chain 4.0) provides a large part of the solution, enabled by an array of technologies such as the internet of things (IoT), robotics and automation, machine learning, 3D printing, artificial intelligence (AI) and augmented reality (AR).
At the Institution of Engineering and Technology (IET), we believe passionately in the creation and management of supply chain ecosystems for global growth “bounce-back” in a post-coronavirus world, backed by strong and secure digital infrastructure and driven by data.
We describe such an ecosystem as a dynamic environment composed of different elements interacting collaboratively to always ensure flexibility, resilience, responsiveness, transparency and traceability.
Each connected node within the supply chain contributes to the growth of the whole system, fostering a virtuous loop of benefits for the supply chain. This can make supply chains both sensitive to change and more resilient to it, so long as that resilience has been factored into their design.
A toolbox of these digital technologies and capabilities can help the redesign of processes and operations to build greater resilience, thus enabling the supply chain ecosystem to better adapt to shortages and surges in the future, and be equipped to pivot quickly and smoothly.
It can connect all players within the ecosystem, providing instant and open visibility for all. For manufacturers, distributors and suppliers, such visibility will be crucial in establishing where and how much stock and value resides within the supply chain, and what the gaps are.
Data gathered from across the ecosystem may be analysed against agreed key performance indicators and shared among the players. Respecting the privacy of those players who require it is becoming ever easier to put into practice, with the increasing maturity and adoption of blockchain technologies. These offer new ways of permanently recording transactions within a secure peer-to-peer network.
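The core mechanism is simple enough to sketch: each record carries a cryptographic hash of the record before it, so tampering with any earlier entry breaks the chain and is immediately detectable. The minimal Python example below illustrates only that idea; real supply-chain platforms add consensus, permissions and peer-to-peer replication on top, and the transaction fields shown are invented for illustration:

```python
# Toy hash-chained ledger: each block's hash covers the previous block's hash,
# so the chain only validates if no earlier record has been altered.
import hashlib
import json
import time

def make_block(transaction, previous_hash):
    block = {"timestamp": time.time(), "transaction": transaction,
             "previous_hash": previous_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

chain = [make_block({"event": "genesis"}, previous_hash="0")]
chain.append(make_block({"supplier": "A", "sku": "PPE-123", "qty": 500},
                        previous_hash=chain[-1]["hash"]))
chain.append(make_block({"distributor": "B", "sku": "PPE-123", "qty": 500},
                        previous_hash=chain[-1]["hash"]))

def chain_is_valid(chain):
    return all(current["previous_hash"] == previous["hash"]
               for previous, current in zip(chain, chain[1:]))

print(chain_is_valid(chain))  # True, until someone edits an earlier block
```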
Supply chains have struggled to balance flow and demand during the crisis. As part of their collective response, manufacturers, distributors and those within supply chains will be giving some thought as to how things can be improved for next time. (Here’s hoping there isn’t a next time!)
Even before this is all over, think about how you can make a once-and-for-all change to benefit your business. Consider and review your digital toolbox. Design stronger dynamic resilience in your supply chain ecosystem and seek out collaborative help and expertise to build it and test it out. After all, resilience and collaboration are the two words that will dominate our business vocabulary and thinking from here on in!
For more on our vision for the creation of supply chain ecosystems, visit our website and download our report, Developing An Eco-system for Supply Chain Success.
For more on the Industry 4.0 technologies referred to in this article have a look at our earlier Business Reporter article, “How to make more time and money from your manufacturing operation”.
by John Patsavellas, Institution of Engineering and Technology
An age of crisis has forced us to re-evaluate and innovate our supply chains – and they’ll be all the stronger for it, argues the IET’s John Patsavellas.
Seth Ravin, CEO, Rimini Street
Shifting their focus from being overseers of data centre operations to being strategic partners with their CEOs will enable tech leaders to develop a business-driven innovation roadmap that allows the company to pivot quickly when market conditions change. But CIOs must get creative about finding resources to operationalise the strategy and roadmap, given limited IT budgets. Significant results can be achieved by leveraging a business-driven roadmap in this way – companies are freeing up 40 per cent or more of their IT budget for transformational innovation that builds resilience and enables growth – which is never more important than today.
A CEO/CIO strategic partnership is needed more than ever to enable quick pivots amid instability
Companies are experiencing unprecedented business dynamics in which global disruption and global competition are creating complex, rapidly changing technology investment priorities. It’s business as unusual for most companies, as they find themselves in constant response mode. There is no longer a status quo in the business world – companies must grow, or they will likely die.
For many CIOs, operating the data centre is now the least of their priorities. Their investment roadmaps have been derailed by disruptive global events, and non-essential projects are being put on hold. To survive and thrive, companies must have a digital core. Almost overnight, CIOs have shifted focus to investing in solutions that serve customers primarily on digital platforms. For many companies, their digital infrastructure must be able to handle 100 per cent remote work. Those that are already digitally platformed will make this adjustment much more nimbly than those that haven’t invested or have underinvested in digital infrastructure.
What CEOs really need from their CIOs is adaptable strategic guidance that will help them pivot very quickly when disruption occurs. CEOs also need their CIOs to make wise IT investment decisions that optimise costs, save jobs, stabilise operations, and shift IT resources to these strategic initiatives. In order to do this, CIOs need to be at the table with the CFO, CEO, and CPO in setting the broader business strategy that drives the IT roadmap.
CIOs must innovate on limited budgets
Today, nearly every company in the world has a financial problem. Either revenue streams have been disrupted on the front end or cash supplies are tighter on the back end. This happened in 2000 with the dot-com meltdown, and again with the mortgage crisis in 2008. However, those scenarios were not the same as the shock that has happened today in such a very short period of time. Every business and every market segment is being impacted by the current global disruption. Governments are spending more money to keep their economies going than they can ever hope to repay. When there is revenue disruption, everything changes. Unfortunately, it takes longer for a company to cut costs than it takes for revenue to fall.
The first order of business, then, is survival. In order to survive, companies must be focused on cash. Many companies are slow-paying or trying to renegotiate with every vendor they have. Scrutiny is being focused on “need to have”, let alone “like to have” or “nice to have” projects and services. Companies are competing for a smaller pool of cash, and prioritisation is top of mind for CIOs who must innovate while keeping current systems operational. Many IT projects are being cancelled. For example, ERP refreshes that don’t bring real value in terms of competitive advantage or growth are being pushed off a year or more until companies rebuild their cash supplies.
The CIO can be the most influential person right now when it comes to IT purchasing decisions. To survive, and ultimately thrive, CIOs need to look at their spending pool and immediately make changes that align revenue on the front end with costs on the back end. In the current market, this means shifting budgets to investments in digital technologies and services that support the company’s innovation strategy, even amid uncertainty. When as much as 90 per cent of the CIO’s budget is spent on ongoing operations and enhancements to back-end systems, this leaves as little as 10 per cent for investments in the company’s innovation strategy.
The goal should be to change back-end systems as little as possible – their level of maturity makes them low-risk for breaking, so the support services needed are primarily tax and regulatory updates, and advice and counsel which can all be obtained from independent, third-party providers at a fraction of the cost paid to the enterprise software vendors.
CIOs instead should optimise the operating costs of back-end systems such as ERP and shift those funds to an IT innovation investment strategy that supports the CEO’s digital priorities. Investments in front-end systems of engagement (where important interactions with customers occur) will more likely address the CEO’s business strategies. These dynamic, customer-facing systems are where constant change is needed in order to remain competitive and where digitalisation will help attract and retain customers – the keys to surviving and thriving.
For more information, please visit riministreet.com
CEOs need their CIOs’ help more than ever as they navigate through today’s global market turbulence, says Rimini Street’s Seth Ravin.
David Rose, University of Reading and Charlotte-Anne Chivers, University of Gloucestershire
Depending on who you listen to, artificial intelligence may either free us from monotonous labour and unleash huge productivity gains, or create a dystopia of mass unemployment and automated oppression. In the case of farming, some researchers, business people and politicians think the effects of AI and other advanced technologies are so great they are spurring a “fourth agricultural revolution”.
Given the potentially transformative effects of upcoming technology on farming – positive and negative – it’s vital that we pause and reflect before the revolution takes hold. It must work for everyone, whether it be farmers (regardless of their size or enterprise), landowners, farm workers, rural communities or the wider public. Yet, in a recently published study led by the researcher Hannah Barrett, we found that the media and policymakers are framing the fourth agricultural revolution as overwhelmingly positive, without giving much focus to the potential negative consequences.
The first agricultural revolution occurred when humans started farming around 12,000 years ago. The second was the reorganisation of farmland from the 17th century onwards that followed the end of feudalism in Europe. And the third (also known as the green revolution) was the introduction of chemical fertilisers, pesticides and new high-yield crop breeds alongside heavy machinery in the 1950s and 1960s.
The fourth agricultural revolution, much like the fourth industrial revolution, refers to the anticipated changes from new technologies, particularly the use of AI to make smarter planning decisions and power autonomous robots. Such intelligent machines could be used for growing and picking crops, weeding, milking livestock and distributing agrochemicals via drone. Other farming-specific technologies include new types of gene editing to develop higher yielding, disease-resistant crops; vertical farms; and synthetic lab-grown meat.
These technologies are attracting huge amounts of funding and investment in the quest to boost food production while minimising further environmental degradation. This might, in part, be related to positive media coverage. Our research found that UK coverage of new farming technologies tends to be optimistic, portraying them as key to solving farming challenges.
However, many previous agricultural technologies were also greeted with similar enthusiasm before leading to controversy later on, such as with the first genetically modified crops and chemicals such as the now-banned pesticide DDT. Given wider controversies surrounding emergent technologies like nanotechnology and driverless cars, unchecked or blind techno-optimism is unwise.
We mustn’t assume that all of these new farming technologies will be adopted without overcoming certain barriers. Precedent tells us that benefits are unlikely to be spread evenly across society and that some people will lose out. We need to understand who might lose and what we can do about it, and ask wider questions such as whether new technologies will actually deliver as promised.
Robotic milking of cows provides a good example. In our research, a farmer told us that using robots had improved his work-life balance and allowed a disabled farm worker to avoid dextrous tasks on the farm. But they had also created a “different kind of stress” due to the resulting information overload and the perception that the farmer needed to be monitoring data 24/7.
The National Farmers’ Union (NFU) argues that new technologies could attract younger, more technically skilled entrants to an ageing workforce. Such breakthroughs could enable a wider range of people to engage in farming by eliminating the back-breaking stereotypes through greater use of machinery.
But existing farm workers at risk of being replaced by a machine or whose skills are unsuited to a new style of farming will inevitably be less excited by the prospect of change. And they may not enjoy being forced to spend less time working outside, becoming increasingly reliant on machines instead of their own knowledge.
Power imbalance
There are also potential power inequalities in this new revolution. Our research found that some farmers were optimistic about a high-tech future. But others wondered whether those with less capital, poor broadband availability and IT skills, and access to advice on how to use the technology would be able to benefit.
History suggests technology companies and larger farm businesses are often the winners of this kind of change, and benefits don’t always trickle down to smaller family farms. In the context of the fourth agricultural revolution, this could mean farmers not owning or being able to fully access the data gathered on their farms by new technologies. Or reliance on companies to maintain increasingly important and complex equipment.
The controversy surrounding GM crops (which are created by inserting DNA from other organisms) provides a frank reminder that there is no guarantee that new technologies will be embraced by the public. A similar backlash could occur if the public perceive gene editing (which instead involves making small, controlled changes to a living organism’s DNA) as tantamount to GM. Proponents of wearable technology for livestock claim they improve welfare, but the public might see the use of such devices as treating animals like machines.
Instead of blind optimism, we need to identify where benefits and disadvantages of new agricultural technology will occur and for whom. This process must include a wide range of people to help create society-wide responsible visions for the future of farming.
The NFU has said the fourth agricultural revolution is “exciting – as well as a bit scary … but then the two often go together”. It is time to discuss the scary aspects with the same vigour as the exciting part.
David Rose, Elizabeth Creak Associate Professor of Agricultural Innovation and Extension, University of Reading and Charlotte-Anne Chivers, Research Assistant, Countryside and Community Research Institute, University of Gloucestershire
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Technology is opening up new ways of farming – but we need to balance the drawbacks as well as the benefits before we take the plunge.
One of the new fads in tech discourse is “combinatorial innovation”. According to this idea, if an emerging technology has a rich set of components, these sub-technologies will keep cross-pollinating and combining into new products as innovators work through all the possibilities.
Recent examples of combinatorial innovation include combining edge computing with 5G, or cryptocurrencies with green energy, and artificial intelligence with IoT, to create AIoT.
An area increasing in significance recently, as a result of combinatorial innovation, is machine vision. This has been a key component of IoT and its industrial strand, IIoT. But its usefulness plateaued until advanced technologies of machine learning, especially deep learning, emerged.
To give another example, deep neural networks (DNNs) couldn’t come into their own without graphic processing units (GPUs), similar to the one your computer uses to process images and video. However, their multitasking capacity gives them far more potential than just providing the graphics for your mobile phone, PC or Xbox – they’ve now become essential to enhancing the performance of deep learning systems too.
But where is the multitasking excellence of GPUs put to good use in machine learning? In the case of “convolutional neural networks” (CNNs) – the type of deep neural network ideal for analysing visual imagery – using GPUs can produce four times as much processing power as hardware without a GPU would.
CNNs are simply a type of neural network that employ a “convolution”. When being used to analyse an image, the CNN first extracts the key features in a picture (in a picture of a cat, the face, legs or ears, for example). Then, in the classification stage, it establishes the probability of each feature of the picture being what the algorithm predicts it to be. For example, in the case of a cat’s mouth, there might be a slight probability that it’s a dog’s mouth, with also some chance, albeit tiny, that it is a hat or a mug. The system can then take a decision based on these probabilities, such as recognising that an image has a certain value such as a name (such as “cat”) or even a quality (such as “faulty”, “ripe” or “too large”).
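As a concrete, if heavily simplified, illustration, the PyTorch sketch below builds a tiny untrained CNN with the two stages just described: convolutional layers that extract features, followed by a classification layer that turns them into per-class probabilities. The class count, image size and layer sizes are arbitrary choices for the example, not a production design:

```python
# Tiny untrained CNN: a feature-extraction stage (convolutions + pooling)
# followed by a classification stage that outputs one probability per class.
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self, num_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)  # sized for 64x64 inputs

    def forward(self, x):
        x = self.features(x)                              # extract features
        x = torch.flatten(x, start_dim=1)
        return torch.softmax(self.classifier(x), dim=1)   # per-class probabilities

model = TinyCNN()
image = torch.randn(1, 3, 64, 64)      # one random 64x64 RGB "image"
print(model(image))                    # e.g. tensor([[0.31, 0.42, 0.27]])
```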
How do CNNs take machine vision and quality inspection to the next level?
Fault detection by machines is a rather complex affair. The algorithm has to recognise edges, chinks, burrs, broken seams and all sorts of unpredictable anomalies in manufactured products. When done manually, programming algorithms that can recognise all possible types of errors involves the analysis of hundreds of thousands of individual images.
Meanwhile, DNNs can learn error features by themselves and – based on those – define each and every problem class accurately. Also, self-learning algorithms’ error margins can get very close to 0 per cent, whereas manually programmed systems average around 10 per cent. Moreover, the higher accuracy of new algorithms is further enhanced by technological advances in industrial image capturing, such as stereo cameras.
Use-cases of CNN-enabled computer vision abound outside defect inspection too. Automated lane detection and sign reading in cars, identifying diseases in healthcare, automated damage analysis of assets and crops in the insurance industry are all solutions that we already encounter in our daily lives. Motion tracking and object navigation are also of key importance in factory settings, where robots and humans increasingly work together in an environment that requires exceptionally strict health and safety monitoring.
But computer vision is still far from maturity, and manufacturers can’t buy plug-and-play solutions off the shelf just yet. Pre-trained networks, however, which already know the basics of identifying image features, can give a major boost to adoption. With pre-trained solutions, for example, customising CNNs to specific factory applications typically takes a couple of hundred – rather than tens of thousands of – images from each error class, and only a basic level of in-house machine-learning expertise.
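In practice, that customisation often means taking a publicly available pre-trained network, freezing the generic feature-extraction layers it has already learned, and retraining only a small new classification head on the factory’s own images. The sketch below shows one plausible way to set this up with torchvision’s pre-trained ResNet-18; the number of error classes and the choice of model are assumptions for illustration:

```python
# Freeze a pre-trained backbone and attach a new, trainable classification head
# sized for the factory's own defect classes.
import torch.nn as nn
from torchvision import models

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # pre-trained weights
for parameter in backbone.parameters():
    parameter.requires_grad = False               # keep the learned features as-is

num_error_classes = 4                             # e.g. scratch, burr, chip, OK (assumed)
backbone.fc = nn.Linear(backbone.fc.in_features, num_error_classes)  # new, trainable head

# Only the new head's weights are trainable now; a standard training loop over a
# few hundred labelled images per class would fine-tune just this final layer.
```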
Since the proliferation of sensors in corporate, industrial and urban environments started, there have been debates about whether the terabytes of data generated this way should be captured at all without a clear understanding of what this tsunami of information can be used for. The latest technological developments in computer vision and machine learning can demonstrate how new combinations of revolutionary digital technologies can give purpose to what only yesterday looked like pointless data hoarding.
The examples illustrating the workings of Convolutional Neural Networks were taken from FreeCodeCamp, a non-profit website offering free training and self-learning opportunities in coding.
Production line fault detection has often thrown a spanner into the works – but innovations in machine learning are improving it at pace, writes Zita Goldman.
In the wake of the Covid-19 pandemic, it seems traditional factory floor practices and work ethics throughout value chains are being re-engineered, as manufacturing companies increasingly adopt digital technologies which incorporate automation and artificial intelligence.
The current economic environment is becoming increasingly volatile on both the supply and demand sides, and manufacturing companies must face the challenge of how best to manage supply-side obstacles during a difficult period in which demand continues to rise.
Firms across all sectors have been obliged to implement digitalisation to remain relevant and grow their business, and both customers and employees are having to adopt technology such as virtual meeting platforms in order to continue to collaborate and engage with their stakeholders.
It seems evident that the pandemic has shortened technology adoption from years to weeks. Different companies, of course, are at different stages of maturity when it comes to the idea of digitalisation. We’re transitioning from a stage where digital technologies are invented and deployed to an era where technologies themselves are continuously developing, learning and evolving.
Many companies benefitted from automation during the initial lockdown period – for example, Hero MotoCorp in India moved to Robotic Process Automation (RPA), in which its transfer of factory goods is completed entirely by robots. These automated systems are not really “intelligent” because they’re programmed so that a given input produces a given output. However, AI expands the scope of what automated systems can do.
Artificial intelligence can assist a compression of the supply chain through harnessing the power of data with thorough analysis to achieve better demand prediction, leading to a more refined supply process. It is apparent that the most important value-driver of artificial intelligence and automation technology isn’t the technology itself. It’s where it is deployed. With the right use of AI technology, businesses can translate data directly into an increase in sales. Through monitoring customer behaviour using particular algorithms, AI can assist in enhanced consumer knowledge, resulting in a near-immediate improvement of sales. Companies can become smarter about which products they are advertising, and tailor their product recommendations using the data collected. Along with this, improvements in reputation and profitability can all be achieved through better use of AI, so it almost becomes a no-brainer for businesses to implement these technologies.
Like the pandemic, digital transformation can be categorised into waves. The first wave – which many are still within – focuses on products, services and processes becoming digitalised. The second wave of digitalisation involves using artificial intelligence to improve and build on the quality of decisions made thereafter, based on the data collected by these digitalised systems – which in turn helps to optimise organisational efficiency.
What does the future hold? Greater visibility to say the least. In summary, all these pros of automation and artificial intelligence amount to an increase in visibility throughout the whole supply chain right through to consumer preferences. Whether it’s clinical trials of a post-pandemic vaccination, or travel and transport habits, in years to come AI can help make all these processes that extra bit more refined.
Has AI or automation helped your business prevail in these difficult times? If you want to discover how similar companies managed to achieve this, get in touch with us at www.gambica.org.uk to find out more.
by Nikesh Mistry, Sector Head – Industrial Automation, Gambica
Covid-19 has forced companies to innovate their supply chain operations in order to survive, says Gambica’s Nikesh Mistry.