Helping SMBs move to the cloud

The increasing adoption of cloud computing by businesses doesn't appear to have convinced many small and medium businesses (SMBs) in the UK to move their business applications from on-premise to the cloud. The three most common concerns are:

  • security of the cloud, particularly data privacy,
  • complexity of migration, the amount of time it takes to migrate, and the downtime while migrating,
  • cost of migration, with many believing that the costs are high.

Although I haven't found the evidence, I suspect the same reluctance holds true for SMBs in other countries. The question is: are these concerns valid, and how can SMBs mitigate these concerns and risks?

Don’t forecast 10 years ahead

An article on Medium written at the end of 2015 tries to predict how we will be living in 2025. The problem with predicting so far out, and ten years is far out, is that we cannot possibly know how things we haven't even thought of yet will dramatically impact our lives. It got me thinking about what someone in January 2006 would have missed when predicting how we would be living in 2016.

The smartphone/tablet revolution

In 2006, the dominant global cellphone manufacturers were Nokia and Blackberry. The term ‘crackberry’ had been coined to describe how addictive the device was. The dominant personal computing device was a PC or a laptop running Windows XP.

In 2007, Apple introduced the first iPhone; in 2010, the iPad came out. They were both instant successes. In 2007, Microsoft launched Windows Vista and seriously damaged its reputation as a software provider.

No one in 2006 would have foreseen that Nokia and Blackberry would miss the uptake of smartphones, that Microsoft would fail to grab a significant share of the smartphone and tablet markets, and how these new technologies would revolutionize business and personal computing.

  • The terms BYOD (Bring Your Own Device) and 'consumerization of IT' arose from people's preference for using their own, non-Microsoft, devices at work because the user interface and experience were better than those of the corporate Windows environment.
  • We now talk about 'mobile-first' application development, and try to make sure our websites are 'responsive', so that people can easily use apps and view websites on their smartphones.
  • The mobile app became the preferred way to access products and services via the Internet, and subordinated the browser to a secondary role.

Social media

In 2006, people would discuss their views and experiences via a blog. The way you made contact electronically with people was by phone or email. You could send short messages to people using products like MSN Messenger. No one could have foreseen how two companies would change things.

Facebook had been in existence as a closed network for universities since 2004. In September 2006 it was extended to anyone with a registered email address, and by 2007 it had begun to change the way people kept in touch.

In July 2006, Twitter was launched, and while it took a while for people to catch on, it soon became the de facto source for breaking news and information sharing for a global community – restricted to messages of 140 characters or fewer. (I was an early user; my user number is 3011. You can find your Twitter number at http://www.idfromuser.com/.)

  • Although the concept of global communities was not new – the worldwide web had been around for over 10 years – Facebook and Twitter introduced the world to the phenomenon of social media. Soon, companies would start having to introduce guidelines for social media usage.
  • Within a few years, businesses were looking for a Facebook- or Twitter-like experience for communication and collaboration for the enterprise.
  • Facebook became the first tech company that could challenge Google as an advertising platform.

Cloud computing

Both Amazon and Salesforce were well-known businesses in 2006, but few people could foresee how cloud computing would change the business and personal software space. It was in 2006 that Amazon launched AWS and Salesforce released Force.com, as platforms on which developers could create and maintain applications that ran in the cloud. The effect of these new platforms was seen in the scramble that started within a few years among the major enterprise software companies to build or acquire cloud applications.

  • Terms such as PaaS (platform-as-a-service), IaaS (infrastructure-as-a-service) and SaaS (software-as-a-service) were created to describe the new types of services that cloud computing could provide.
  • Who could have predicted that a new CEO at Microsoft (Satya Nadella) would change the company's strategy to support competitive platforms because of the proliferation of cloud apps, and the negative impact the cloud had on Microsoft's classic on-premise business?

Tech companies

Back in 2006 it was easy to see that Google was the new leading player in the tech space, but no one could have guessed how far ahead Apple would pull in market capitalization. In the early 2000s, some large software companies had begun acquiring other companies – Microsoft with Great Plains in 2001, Oracle with PeopleSoft/JD Edwards in 2004 – but by 2010 the major enterprise software companies had gone on an acquisition binge. Despite their size and reach, these software behemoths were unable to anticipate changes in software requirements and so had to buy products in order to compete.

  • In 2006, no one would have predicted how much of shareholders' and customers' money would have been spent by enterprise software companies on acquisitions.

Looking further back

It got me wondering what people in previous decades might also have missed.

  • 1996 – predictions would not have foreseen the growth, dominance and influence of Google ten years later.
  • 1986 – with companies like IBM and DEC as the dominant players, predictions would have missed the impact of the then recently released Windows operating system and the personal computer revolution.

Conclusion

If you are going to make predictions about how people will live or business will work, my recommendation is that you keep it to a reasonable time frame. Five years is probably a safe horizon to predict, but any longer and you are likely to be horribly wrong.

Gartner’s End User Predictions from 2010

This is what Gartner predicted for end users in 2010. What do you think now?

  1. By 2012, 20% of businesses will own no IT assets.
  2. By 2012, India-centric IT service companies will represent 20% of the leading cloud aggregators in the market.
  3. By 2012, Facebook will become the hub for social networks integration and Web socialization.
  4. By 2014, most IT business cases will include carbon remediation costs.
  5. In 2012, 60% of a new PC’s total life greenhouse gas emissions will have occurred before the user first turns the machine on.
  6. Internet marketing will be regulated by 2015, controlling more than $250 billion in Internet marketing spending worldwide.
  7. By 2014, more than three billion of the world’s adult population will be able to transact electronically via mobile and Internet technology.
  8. By 2015, context will be as influential to mobile consumer services and relationships as search engines are to the Web.
  9. By 2013, mobile phones will overtake PCs as the most common Web access device worldwide.

Review of the Gartner ERP Magic Quadrant

It's been six years since I posted my views on the Gartner ERP magic quadrant for Tier 2 vendors. It has been one of the most viewed posts on my blog, but I think it's now time to take another look at the ERP magic quadrant (MQ) and the ERP market as a whole.

For reasons that don't seem to make sense, Gartner has changed a number of things about this MQ category.

  1. It's now called "Single-Instance ERP for Product-Centric Midmarket", which changes the whole category completely.
  2. The term 'product-centric' was added, but since ERP is a product the new term seems redundant.
  3. They no longer differentiate between Tier 1 ERP (i.e., expensive systems for large corporations) and Tier 2 (for midmarket companies), which is what made the original MQ useful for midmarket companies.

The obvious Tier 1 ERP products are SAP, Oracle and now Microsoft Dynamics AX. The growing issue in the ERP market, however, is that other vendors, like Infor, might be called Tier 1.5 (i.e., getting there), not forgetting, of course, brand new entrants like Workday. The old Tier 2 vendors are still there – QAD, SYSPRO, JD Edwards, Epicor, Sage – but up-and-comers are now appearing – NetSuite, Plex, Acumatica. Why not have a category for the Tier 1.5 and Tier 2 vendors?

So, here are my views on the ERP MQ, and where ERP is going.

  • The ERP sector is starting to resemble the old days of the big car manufacturers – e.g. GM, Ford, VW, Toyota – who didn't realize that a new group of manufacturers was emerging – e.g. Hyundai, Kia, Mahindra – who would take a large slice of the market. And that leaves out the brand new set of competitors with disruptive technology, like Tesla. IDC analyst Mike Fauscette recently commented about incumbent business software vendors:

your business is at risk more from new models that can appear almost overnight than from new products/services or changes in pricing … A large part of the future of business software is tied up in the new platforms (PaaS) that are forming and evolving today.

  • Ten years ago, companies needed the MQ to get insight into the category. Now it's easier, quicker and cheaper to search the Internet to find out about ERP vendors. You may only need to refer to the MQ if your board needs placating.
  • What's the value of ERP? Aberdeen research pointed out that the prime reason for leading companies to implement ERP is to "serve as a complete and auditable system of record". ERP therefore should be your system of record, the 'single source of truth' for financial reporting and auditing, but it doesn't have to do everything else. For the industry-specific line-of-business functions – customer support, supply chain, etc – consider best-of-breed and cloud applications. The way you make it work is by integrating those apps back to the ERP. Another post by Mike Fauscette predicts:

As software as a service (SaaS) replaces old monolithic systems, the need for the older application packaging becomes irrelevant. The new reactive systems are a loose coupling of microservices based on a business process not a collection of monolithic application blocks that can be integrated into a “suite”.

  • The ERP vendor market will change significantly in the next 5-10 years, because the use of technology for business transactions and processes will change due to the growth of cloud computing. Gartner is now using the term 'postmodern ERP', saying that cloud vendors are now disrupting the old ERP providers. When your company invests in an ERP, bear that in mind.
  • ERP vendors need to stop thinking that customers value 'an integrated business management system'. They did in the past, when there wasn't any alternative; now they want something that benefits the part of the business they need to fix or transform, and the rest is unnecessary fluff. A recent PwC report uses the term 'hybrid ERP' for applications that are fit-for-purpose and cloud-based, not the highly complex, expensive ones of the past.

Traditional ERP systems have long been criticized for their high cost, lack of flexibility, and difficulty of implementation. Hybrid ERP assemblages of modular components, on the other hand, typically cost less, allow companies to pick and choose among their various functions, and offer greater mobility.

The ERP landscape is going to go through a period of disruption and change in the coming years. As a customer or future customer of an ERP vendor, make sure you don't lock yourself in for too long, or spend too much time and money on lots of specific customizations. There will be options in the next few years that many businesses haven't even considered yet. Keep your options open by leaving some of that ERP investment for better tech spending in the future.

Taking the steam out of STEM

For the last few years, there have been many calls and articles about the importance of STEM – Science, Technology, Engineering, Mathematics. I've seen the acronym used most often in US-based communications, but it's seen as an issue in other countries as well; for example, in South Africa we tend to talk about "Maths and Science". The STEM proponents say that growing these skills is the only way we will be able to ensure employment in a world that is becoming more automated and smarter. But I am now hearing a different view: that there is more to work and life than maths and science skills.

In 2007 Geoffrey Moore started arguing a new case – the revenge of the liberal arts graduates – saying that as we move from computing to communications, this new connected world will place

an enormous premium on people who are fluent in communications.

It took a while to catch on, but in 2015 a number of articles appeared pushing the 'arts' argument.

An article in the Harvard Business Review titled Build STEM Skills, but Don’t Neglect the Humanities argued that we need a STEMMA focus – the M referring to management, and A to Arts.

how could it be beneficial to the future to deemphasize the arts, which inform our knowledge of beauty and meaning in human affairs? All the brilliant discoveries of STEM will not solve the grand challenges of today’s world — ignorance, poverty, intolerance, and political conflict – without the practical wisdom of humanities-trained leaders.

… It is only through the humanities that we will be able to appreciate the answers that superintelligent computers will give us when we ask them the hard questions. It is only through the humanities that we will increasingly recognize and build on what we humans uniquely are. It is through STEM plus MA progress that we have the chance to become practically wise.

Analyst Vinnie Mirchandani on his New Florence blog alerted me to an interview with NASA astronaut Leland Melvin, who uses the term STEAM (A=Arts).

My passion with the “A” is that it helps us be inclusive in school and also promotes project-based learning … Project-based learning is what prepares students for real-world problem solving

The issue was discussed in another Harvard Business School article which asked the question:

As machines increasingly perform complex tasks once thought to be safely reserved for humans, the question has become harder to shrug off: What jobs will be left for people?

and answered it with:

it’ll be those that require strong social skills — which it defines as the ability to work with others — something that has proven to be much more difficult to automate.

citing research that showed:

nearly all job growth since 1980 has been in occupations that are relatively social skill-intensive … high-skilled, hard-to-automate jobs will increasingly demand social adeptness.

It appears that social skills are still important.

  1. They are valued in jobs across the entire wage distribution.
  2. Social and cognitive skills are complementary, not competing.
  3. Work that needs only low levels of social skills is also likely to be routine and at high risk of automation.

Taking somewhat of a middle ground was a 2014 article on Medium that discussed the essential skills needed to survive the new future of work. Technology was one essential skill, but problem solving and self-management were the others.

As someone whose degree was more in arts than science, and who got into the tech industry almost by accident, I am glad to see that the arts are starting to make a comeback. The difficulty from a personal perspective is that I am surrounded at home by family members with engineering and medical backgrounds who believe that an arts education is a wasted one. Next time the issue comes up, I will try to use the points raised in this blog.

Bimodal IT doesn’t mean complexity

For over a year the Gartner analyst group has been talking about the need for 'bimodal IT'. An article in InformationWeek described it as the need for an IT organization to

split its focus between the core services that make other things possible and the more exciting possibilities of digital innovation.

When I first heard of bimodal IT I was still working in the ERP software space. Coming from that traditional background, I used to think that any bimodal IT effort had to be somewhat big and complex. But now that I'm at a cloud software company, I'm beginning to see things differently.

According to a recent study, 44% of enterprises expect cloud computing to help launch new business models, and this is expected to rise to 55% by 2018. Moving some of your enterprise apps to the cloud may therefore give you one bimodal IT project.

There will be cases, however, where an IT department needs to develop apps. Platform-as-a-service (PaaS) offerings deliver higher productivity than traditional on-premise development environments, and they don't just enable faster starts but also quicker results.

Then there was something I learnt recently about the power of the much used, and also much maligned, product – Excel. If people are going to organize, format, tabulate or calculate data, they will most probably use an Excel spreadsheet. Excel formulas are a form of business application logic that everyone understands. But Excel is also architected for integration because it is stored in a standards-based XML format. This means that, using a PaaS for integration, business users can create sophisticated reporting and analysis solutions with a tool they already know and understand, drawing on data from other sources. This is something we wrote about on the company blog recently – making Excel awesome.
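As a rough illustration of that pattern, here is a minimal sketch (in Python, assuming the requests and openpyxl packages) of pulling data from another system and landing it in an Excel workbook that a business user can then extend with the formulas they already know. The REST endpoint, field names and file name are hypothetical; a real integration platform would do this as a configured flow rather than hand-written code.

```python
import requests
from openpyxl import Workbook

# Hypothetical endpoint returning JSON such as [{"region": "EMEA", "revenue": 1200}, ...]
orders = requests.get("https://example.com/api/orders", timeout=30).json()

wb = Workbook()
ws = wb.active
ws.title = "Orders"
ws.append(["Region", "Revenue"])

# Write one row per record pulled from the other system.
for row in orders:
    ws.append([row["region"], row["revenue"]])

# An ordinary Excel formula totalling the revenue column - familiar territory
# for any spreadsheet user, who can build further analysis on top of it.
ws.cell(row=len(orders) + 2, column=2, value=f"=SUM(B2:B{len(orders) + 1})")

wb.save("orders_report.xlsx")
```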

So bimodal IT doesn't have to be big and expensive, nor does it have to depend only on IT. I would be interested to learn of other options that make bimodal IT easier and quicker.

Why monolithic systems of record will disappear

I thought that having left the ERP industry I would not have any reason or inspiration to write about it, but I was wrong. My experiences since I started working in the cloud application market have led me to believe that the era of the monolithic system of record, as typified by ERP, might be coming to an end.

When I started in the ERP field as it was taking off in the late 1990s, only large organizations could afford, or saw the rationale for, ERP. It took another ten years for ERP systems to penetrate businesses of most sizes. Because moving off ERP is a mindset change, and organizations have invested hugely in enterprise software, I don't believe that change will happen overnight. But as the economics and business benefits of cloud become more apparent, and a younger generation assumes decision-making roles, it could occur over the coming decade. Several writers have commented that the growth of cloud computing bears many similarities to the expansion of electrification in the early 1900s.

Here are some of my reasons for believing that monolithic systems will disappear.

1. You don't need an ERP as the single source of company truth
Many organizations use specialized applications for key customer-facing operations, and increasingly these are SaaS (software-as-a-service) systems. With modern iPaaS (integration platform-as-a-service) tools, integrations can allow different systems to each be the system of record for a set of business functions. The only system that needs to be centralized is accounting, and companies are already using iPaaS to sync operational data into the accounts system.
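As a rough illustration of that kind of sync, here is a minimal Python sketch of what an iPaaS flow does under the covers: read operational records from a customer-facing SaaS system and post a summarized journal entry into the accounting system, which stays the centralized system of record for the financials. The endpoint URLs, field names, account codes and token are all hypothetical; an actual iPaaS would provide this as a configured, monitored flow rather than code.

```python
import requests

OPS_API = "https://ops.example.com/api/invoices?status=posted"   # hypothetical operational system
ACCOUNTING_API = "https://accounts.example.com/api/journals"     # hypothetical accounting system
HEADERS = {"Authorization": "Bearer <token>"}                    # placeholder credential

# 1. Pull the day's posted invoices from the operational (customer-facing) system.
invoices = requests.get(OPS_API, headers=HEADERS, timeout=30).json()

# 2. Summarize them into a single journal entry for the accounts system.
total = sum(inv["amount"] for inv in invoices)
journal_entry = {
    "date": "2016-01-31",
    "description": "Daily sales summary from operational system",
    "lines": [
        {"account": "1100-Debtors", "debit": total},
        {"account": "4000-Sales", "credit": total},
    ],
}

# 3. Post the summary into the accounting system of record.
response = requests.post(ACCOUNTING_API, json=journal_entry, headers=HEADERS, timeout=30)
response.raise_for_status()
```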

2. Legacy systems diminish the ability to innovate
According to a recent survey, 90% of IT decision-makers say legacy systems prevent them from adopting new digital technologies that they need.

3. The push to digital disruption requires new approaches
An article by Constellation Research explains that ERP arose out of the business process re-engineering movement and the disruptive innovation caused by the PC. In its day the emergence of ERP software disrupted the existing business model, but there is now the new disruptive influence arising as a result of cloud and mobile computing. It would be unlikely that a new company these days would opt for an on-premise ERP; more likely it would look for a SaaS solution. “New business model = new ERP model!”

4. The cloud is now acceptable for many business-critical solutions
Analyst companies are reporting a change in the way the cloud is perceived. Gartner notes that fears about the suitability of the cloud for critical applications are now a thing of the past. Forrester Research found that agility is the most important driver of cloud adoption. The fact that many cloud providers are now certified for compliance and security regulations has also made them more acceptable.

5. Ease of SaaS upgrades
SaaS software has the advantage of "bite sized, frequent, managed-by-the-vendor upgrades", compared to the headaches of managing on-premise upgrades. I am noticing a proliferation of SaaS applications with a fairly narrow industry focus. What enables that is that it is relatively easier to develop and maintain cloud-based applications (compared to on-premise ones), and the SaaS subscription model enables profitability at an earlier stage than the traditional license fee model.

6. ERP consultant recommends alternatives
When ERP consulting firm Panorama recommends alternatives to typical ERP solutions, you have to ask whether ERP may be close to its sell-by date.

Companies are increasingly being made aware of the possibility of digital disruption to their business. It's difficult to respond to a disruptive business model that demands agility while staying with a monolithic legacy system like ERP.