The Curious Case of Salesforce and Workday: Data Integration in the Cloud
The growing enterprise adoption of Salesforce SFA/CRM, Workday HR, NetSuite ERP, Oracle On Demand, Force.com for apps and Amazon Web Services for e-commerce will result in more fragmented enterprise data scattered across the cloud.
Automating the moving, monitoring, securing and synchronization of data is no longer a “nice-to-have” but a “must-have” capability.
Data quality and integration issues — aggregating data from the myriad sources and services within an organization — are CIOs’ and IT architects’ top concern about SaaS and the main reason they hesitate to adopt it (data security is another). They have seen this hosted data silo and data jungle problem too many times in the past. They know how this movie is likely to unfold.
Developing strategic (data governance), tactical (consistent data integration requirements) and operational (vendor selection) approaches to this emerging “internal-to-cloud” data quality problem is, in my humble opinion, a growing priority. Otherwise most enterprises will get less than optimal value from their various SaaS solutions, and things are likely to get out of control pretty quickly. Read more 
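As a rough illustration of the “internal-to-cloud” reconciliation work involved, here is a minimal Python sketch that compares flat record exports from two hypothetical SaaS silos (a CRM and an HR system) on a shared email key. All names and fields are invented for illustration; a real integration would go through each vendor’s APIs.

```python
# Illustrative only: reconcile flat record exports from two hypothetical
# SaaS silos (a CRM and an HR system) on a shared email key.
def reconcile(crm_records, hr_records, key="email"):
    """Return (matched, crm_only, hr_only) key sets for two record lists."""
    crm_keys = {r[key] for r in crm_records}
    hr_keys = {r[key] for r in hr_records}
    return crm_keys & hr_keys, crm_keys - hr_keys, hr_keys - crm_keys

# Invented sample exports standing in for API extracts
crm = [{"email": "ann@acme.com", "stage": "won"},
       {"email": "bob@acme.com", "stage": "open"}]
hr = [{"email": "ann@acme.com", "dept": "sales"},
      {"email": "cat@acme.com", "dept": "eng"}]

matched, crm_only, hr_only = reconcile(crm, hr)
```

Even this toy version surfaces the core governance question: which system is the source of truth for the records that appear in only one silo?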
Do your KPIs Reflect Business Insights?
Obsolete KPIs can be Lethal
In the Aesopian fable of the one-eyed stag, a deer overcomes his visual handicap by grazing on a cliff near the sea with his good eye facing the land. Since all his known dangers came from land, this kept him safe from predators for a very long time – until he was killed by a hunter in a boat.
The relevance of our KPIs can make or break our business. KPIs are often defined as static metrics for an enterprise and can easily become outdated. Economic uncertainty and competitive pressures are prompting questions on the validity of KPIs and performance management processes. To stay competitive requires a process of continually validating metrics with the business environment.
Another common challenge with KPIs is that there are too many of them. Modern technology has given us the ability to measure a very large number of parameters in a business, some of which are more relevant than others. Jack Welch is known to have said, “Too often we measure everything and understand nothing.” Monitoring some metrics and ignoring others are decisions we make based on our business perspective.
Relevance Enabled by Process
How do you decide which KPIs are most relevant to success? An often overlooked first step is to understand the primary business goals before looking at the technology solution. Avinash Kaushik defines KPIs simply as “Measures that help you understand how you are doing against your objectives”. This fundamental approach is a good way of weeding out items that are not relevant to what we want as a business and of avoiding adverse surprises. At a deeper level, building a robust Business Analytics solution requires answers to questions such as:
1. What events have the greatest impact on the business, and how are they measured?
2. How often do you validate that you are measuring the right parameters?
3. What instrumentation do you need to create the right dashboards for your KPIs? Can this instrumentation be updated as the KPIs change?
4. What is the process for collecting, synthesizing, manipulating and presenting the data to represent these metrics? How does the process change when the metrics change?
5. What technologies and architecture are necessary to support those decision-making patterns? Is a “single source of truth” needed, or is a federated model possible?
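To make question 4 concrete, here is a toy Python sketch in the spirit of Kaushik’s definition: a KPI computed from raw events and checked against an explicit objective. The event schema and the 25% target are invented for illustration.

```python
# Toy KPI tied to an explicit objective (Kaushik: measures that help you
# understand how you are doing against your objectives). The event schema
# and the 25% target are invented for illustration.
events = [
    {"visitor": "v1", "action": "view"},
    {"visitor": "v1", "action": "purchase"},
    {"visitor": "v2", "action": "view"},
    {"visitor": "v3", "action": "view"},
]

def conversion_rate(events):
    """Share of viewing visitors who went on to purchase."""
    viewers = {e["visitor"] for e in events if e["action"] == "view"}
    buyers = {e["visitor"] for e in events if e["action"] == "purchase"}
    return len(buyers & viewers) / len(viewers) if viewers else 0.0

TARGET = 0.25  # hypothetical objective: convert 25% of visitors
kpi = conversion_rate(events)
on_track = kpi >= TARGET
```

The point is that the target lives next to the measure, so when the business objective changes, the definition of “on track” changes with it instead of going stale.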
Centers of Excellence
Needless to say, this approach requires tight integration between business owners and IT architects. A recent Gartner study says that “IT collaboration initiatives fail because IT leaders hold mistaken assumptions about basic issues… rather than making technology the starting point, IT leaders should first identify real business problems and key performance indicators (KPIs) that link to business goals.”
Many business executives believe that IT is unable to deliver results where it counts. At the same time, IT organizations spend an incredible amount of time, money and resources simply reporting obvious data within their business process and workflows.
An organizational solution to this problem is the creation of a Competency Center or Center of Excellence (CoE) with representation from both business and IT and shared objectives. The CoE defines the blueprint for implementing BI, Performance Management and Analytics aligned with KPIs. Some of the obvious benefits include:
- Cost savings from eliminating Silos
- Better collaboration between Business and IT
- Joint ownership of corporate objectives
There are other aspects of the CoE that make it a practical vehicle for deploying analytics solutions. The sheer volume and texture of business data is more complicated than at any point in modern business history. The world’s data doubles every two years, creating more opportunities for analysis. Understanding this data even at an aggregate level requires a business perspective combined with technological expertise. Furthermore, working with technologies such as Big Data for unstructured data analysis requires business leaders and IT implementors to work together.
The CoE is the ideal structure for implementing a Business Perspective Solution. A well-implemented Business Perspective Solution takes into account the key objectives of the business, leverages sophisticated analytics technologies and focuses on sustainable processes to support decision making in an organization.
Superior decisions based on business perspective separate winners from losers.
Are your KPIs in sync with your business perspectives? Please share your comments below.
Further Reading
1. Six Web Metrics / Key Performance Indicators To Die For by Avinash Kaushik, Occam’s Razor
2. Practical BI – What CEOs want from BI and Analytics by Ravi Kalakota, Business Analytics 3.0
3. The Stupidity of KPIs in Business Analytics by Mark Smith, Ventana Research
Analytics-as-a-Service: Understanding how Amazon.com is changing the rules
“By 2014, 30% of analytic applications will use proactive, predictive and forecasting capabilities” Gartner Forecast
“More firms will adopt Amazon EC2, EMR or Google App Engine platforms for data analytics. Put in a credit card; buy an hour’s or a month’s worth of compute and storage. Charge for what you use. No sign-up period or fee. Ability to fire up complex analytic systems. Can be a small or large player” Ravi Kalakota’s forecast
—————————-
Big data Analytics = Technologies and techniques for working productively with data, at any scale.
Analytics-as-a-Service is cloud-based: elastic and highly scalable, no upfront capital expense, pay only for what you use, available on-demand.
The combination of the two is the emerging new trend. Why? Many organizations are starting to think about “analytics-as-a-service” as they struggle to cope with the problem of analyzing massive amounts of data to find patterns, extract signals from background noise and make predictions. In our discussions with CIOs and others, we are increasingly talking about leveraging the private or public cloud computing to build an analytics-as-a-service model.
Analytics-as-a-Service is an umbrella term I am using to encapsulate “Data-as-a-Service” and “Hadoop-as-a-Service” strategies. It is sexier 🙂
The strategic goal is to harness data to drive insights and better decisions faster than the competition, as a core competency. Executing on this goal requires developing state-of-the-art capabilities around three facets: algorithms, platform building blocks, and infrastructure.
Analytics is moving out of the IT function and into the business — marketing, research and development, strategy. As a result of this shift, the focus is more on speed-to-insight than on common or low-cost platforms. In most IT organizations it takes anywhere from 6 weeks to 6 months to procure and configure servers, then several more months to load, configure and test software. Not very fast for a business user who needs to churn data and test hypotheses. Hence the cloud-based analytics alternative is gaining traction with business users.
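A back-of-envelope way to see why the economics favor the cloud for occasional analytics: compare pay-per-use against an upfront server purchase. The figures below are hypothetical placeholders, not real vendor rates.

```python
# Back-of-envelope break-even: hours of on-demand use at which renting
# costs as much as owning. All figures are hypothetical placeholders.
OWNED_UPFRONT = 20_000.0   # server purchase + setup (hypothetical)
CLOUD_PER_HOUR = 2.50      # on-demand cluster rate (hypothetical)

def breakeven_hours(upfront=OWNED_UPFRONT, hourly=CLOUD_PER_HOUR):
    """Below this many hours of use, pay-per-use is the cheaper option."""
    return upfront / hourly

hours = breakeven_hours()  # 8000.0 hours at these placeholder rates
```

At these made-up rates, an analyst who needs a cluster for a few hundred hours a year is far below break-even, which is exactly the economic argument the business users above are responding to.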
Big Data, Analytics and KPIs in E-commerce and Retail Industry
- How to convert Lookers to Bookers…
- How to create unique and effective Digital Experiences that impact probability of purchase or likelihood of return.
- What offers might result in higher “take rates”
The change in consumer behavior and expectations that e-commerce, mobile and social media are causing is hugely significant – big data and predictive analytics will separate brand/retail winners from losers. This won’t happen overnight, but the transformation is real.
The retail industry makes up a sizable part of the world economy (6-7%) and covers a large ecosystem – E-commerce, Apparel, Department Stores, Discount Drugstores, Discount Retailers, Electronics, Home Improvement, Specialty Grocery, Specialty Retailers and Consumer Packaged Goods suppliers.
Retail increasingly looks like a barbell – a brand-oriented cluster at the high end, a very thin middle, and a price-sensitive cluster at the low end. The consumerization of technology is putting more downward pricing pressure on an already competitive “middle” retail environment. The squeeze is coming from e-commerce and new “point, scan and analyze” technologies that give shoppers decision-making tools — powerful pricing, promotion and product information, often in real time. Applications on iPhones and Android devices, like RedLaser, can scan barcodes and provide immediate price, product and cross-retailer comparisons. They can even point you to the nearest retailer who can give you free shipping (total-cost-of-purchase optimization). This will lead to further margin erosion for retailers that compete on price (a sizable chunk of the market in the U.S., Europe and Asia).
Data analytics is not new for retailers. Point-of-sale transactional data obtained from bar codes first appeared in the 1970s. A pack of Wrigley’s chewing gum was the first item scanned using a Universal Product Code (UPC), in a Marsh Supermarket in Troy, Ohio in 1974. Since then, retailers have been applying analytics to get smarter and speed up the entire industry value chain.
More recent use cases of retail analytics include: Read more 
Proactive Risk Management – New KPIs for a Dodd-Frank World
The financial crisis of 2007–2011 is driving widespread changes in the U.S. regulatory system. The Dodd-Frank Act addresses the “too big to fail” problem by tightening capital requirements and supervision of large financial firms and hedge funds. It also creates an “orderly liquidation authority” so the government can wind down a failing institution without market chaos.
Financial institutions will be spending billions to strengthen, streamline and automate their recordkeeping, risk management KPIs and dashboard systems. The implications for Data Retention and Archiving, Disaster Recovery and Continuity Planning have been well covered. But leveraging Business Analytics to proactively and reactively manage and monitor risk and compliance is an emerging frontier.
We believe that Business Analytics and real-time data management are poised to play a huge role in the next generation of risk and compliance management in the Financial Services industry (FSI). In this posting, we examine the strategic and structural challenges, the dashboards and KPIs of interest that provide feedback, and what an effective execution roadmap needs to look like for every organization.
Read more 
Mobile BI – Business KPIs and Dashboards “on-the-go”
Who doesn’t want to achieve faster “time-to-information” and shorter “time-to-decision” for executives and managers with mobile BI? Who doesn’t want to disseminate insights or KPIs to front-line employees, such as field sales representatives, line of business managers, and field service employees?
The question is not whether Mobile BI is a good idea but how to execute the program in a low-cost way. How do you design and deploy eye-popping “wow” apps? How do you support, maintain and enhance apps that are constantly changing? What technology and infrastructure do you put in place for a national or global deployment? Who is going to fund all this plumbing – corporate, LoB or IT?
Business Analytics solutions for “always-on” 3/4G-enabled mobile devices – iPads, iPhones, tablets, smart phones – are becoming prevalent as the form factor becomes appropriate for BI. We are increasingly seeing firms build state-of-the-art dashboard solutions for iPads. These “post-desktop” apps provide senior management with intuitive, interactive access to the company’s most important business KPIs and help them deal with data overload.
Tablets, 4G wireless and next-gen displays (plus gesture-based and verbal interfaces) have enabled new productivity improvements and better ways to consume information, perform ad-hoc querying and do scenario planning. Dashboards, heat maps and scorecards on iPads, iPhones and Androids are intuitive, attractive, powerful, and available at any time and any place: a perfect mix for top managers, sales teams and even customers.
BI (and Information Management) is a natural fit for mobile devices. Managers and blue- and white-collar workers spend a majority of their time away from their desks. Most are traveling, walking about or driving from site to site. And it’s these mobile workers who need the most up-to-date information. They need mobile BI to retrieve data to make on-the-spot decisions, monitor operational processes, and review KPI and work-in-progress dashboards.
Are you one of these — Data Scientist, Analytics Guru, Math Geek or Quant Jock?
“The sexy job in the next ten years will be statisticians…” ‐ Hal Varian, Google
Analytics Challenge — California physicians group Heritage Provider Network Inc. is offering $3 million to any person or firm who develops the best model to predict how many days a patient is likely to spend in the hospital in a year’s time. Contestants will receive “anonymized” insurance-claims data to create their models. The goal is to reduce the number of hospital visits, by identifying patients who could benefit from services such as home nurse visits.
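This is emphatically not a contest-winning model, just a sketch of the baseline shape of the problem: fit next-year hospital days against a prior-year claims feature with ordinary least squares. The data below is synthetic, and the single claim-count feature is an assumption for illustration.

```python
# Not a contest-winning model: a univariate ordinary-least-squares baseline
# on synthetic data, predicting next-year hospital days from prior-year
# claim counts (the single feature is an assumption for illustration).
def fit_ols(xs, ys):
    """Closed-form simple linear regression; returns (slope, intercept)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

claims = [1, 2, 4, 8]   # prior-year claim counts (synthetic)
days = [0, 1, 2, 4]     # next-year hospital days (synthetic)
slope, intercept = fit_ols(claims, days)

def predict(x):
    return slope * x + intercept
```

A real entry would use far richer features and models, but even a baseline like this lets the provider rank patients who might benefit from home nurse visits.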
The need for analytics talent is growing everywhere. Analytics touches everyone in the modern world. It’s no longer on the sidelines in a support role, but instead is driving business performance and insights like never before.
Job-posting analyses indicate that market demand for data scientists and analytics gurus capable of working with large real-time data sets, or “big data,” took a huge leap recently. The most common definition of “big data” is real-time insights drawn from large pools of data. These datasets tend to be so large that they become awkward to work with using on-hand relational database tools or Excel.
It’s super trendy to be labeled “big data” right now – but that doesn’t mean the business trend isn’t real. Take, for instance, the following scenario in B2B supply chains. Coca-Cola Company is leveraging retailers’ POS data (e.g., Walmart’s) to build customer analytical snapshots, including mobile iPad reporting, and to enable the CPFR (Collaborative Planning, Forecasting, and Replenishment) process in its supply chain. Walmart alone accounts for $4 billion of Coca-Cola Company sales.
Airlines, hotels, retail, financial services and e-commerce are industries that deal with big data. The trend is nothing new in financial services (low-latency trading, complex event processing, straight-through processing) but radical in traditional industries. In trading, the value of insights depends on the speed of analytics. Old data or slow analytics translate into losing money.
As data growth in business processes outpaces our ability to absorb, visualize or even process it, new Business Analytics talent will have to emerge. New roles such as Data Scientists, Analytics Savants and Quant Modelers are required in almost every corporation to convert the growing volumes of data into actionable insights.
Look at these data stats.
Harry Potter, The Elephant, The FBI and The Data Warehouse
In the ancient Indian parable of the elephant, six blind men touch an elephant and report six very different views of the same animal. Compare this scenario to a data warehouse that receives data from six different sources. “Harry Potter and the Sorcerer’s Stone” as a field in a database can be written as “HP and the Sorcerer’s Stone”, as “Harry Potter I” or simply as “Sorcerer’s Stone”. In the data warehouse these are four separate movie titles; for a Harry Potter fan, they are the same movie. Now increase the number of movies to cover the entire Harry Potter series and add fifty languages, and you have a set of titles that may perplex even a real Harry Potter aficionado.
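The title-variant problem can be sketched with the Python standard library: normalize the titles, then fuzzy-match them against a canonical list with difflib. The “hp” alias expansion and the 0.5 cutoff below are arbitrary illustrative choices, not a production matching strategy.

```python
# Normalize variant titles, then fuzzy-match against a canonical list
# using difflib from the standard library. The "hp" alias expansion and
# the 0.5 cutoff are arbitrary illustrative choices.
import difflib
import re

CANONICAL = ["Harry Potter and the Sorcerer's Stone"]

def normalize(title):
    text = title.lower().replace("hp", "harry potter")  # expand known alias
    return re.sub(r"[^a-z0-9 ]", "", text).strip()

def match(title, canon=CANONICAL, cutoff=0.5):
    """Return the canonical title a variant most resembles, or None."""
    normed = [normalize(c) for c in canon]
    hits = difflib.get_close_matches(normalize(title), normed,
                                     n=1, cutoff=cutoff)
    return canon[normed.index(hits[0])] if hits else None
```

Production entity-resolution tools are far more sophisticated, but the two steps shown here, normalization followed by similarity matching against a master list, are the heart of what a data warehouse needs to reconcile those four “different” movies.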
What does this have to do with data analytics?
Apple iCloud Service – Lessons for “Big Data” and BI Architects
Apple, with its iCloud offering, is attacking the consumer-facing digital-content big data problem. Big Data is challenging on many fronts, from insight generation (e.g., analytics and query optimization), to the practical (e.g., horizontal scaling), to the mundane (e.g., backup and recovery).
On June 6, 2011, Apple Inc. launched iCloud, a purpose-built digital-locker service for its 225 million iTunes accounts that frees the end user from the tyranny of the device. iCloud allows users to store digital files such as photos, MP3 music, videos and documents in the cloud and access them from Internet-connected devices like iPhones, iPads, iPods and iMacs.
So, what’s the big deal? Apple is addressing a classic BI data management problem: how to free up data trapped in “device and application jails” in a user-friendly way. The “scan and match” concept is quite applicable to large-scale Enterprise Data Warehouses, which suffer from data integrity issues as edge data-capture and consumption devices proliferate.
Data ingestion, governance and management is a huge problem facing large organizations. As data volumes double every year, not having a basic data management strategy will become an Achilles’ heel. Most organizations unfortunately don’t know what data assets they have, where those assets are, how they are organized or how well they are secured. Apple shows a neat way to address the Big Data problem in personal cloud management.
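The “scan and match” idea can be shown in miniature: fingerprint content, and store or transfer only what the catalog has not seen before. This sketch is purely illustrative; Apple’s actual matching works against iTunes metadata and acoustic characteristics, not byte hashes.

```python
# Miniature "scan and match": fingerprint each file by content hash and
# transfer only content the catalog has never seen. Illustrative only;
# Apple's real matching uses music metadata, not byte hashes.
import hashlib

def scan_and_match(files, catalog):
    """files: {name: bytes}; catalog: {digest: canonical_name}.
    Returns (uploads_needed, matched_pairs)."""
    uploads, matched = [], []
    for name, blob in files.items():
        digest = hashlib.sha256(blob).hexdigest()
        if digest in catalog:
            matched.append((name, catalog[digest]))  # already in the cloud
        else:
            uploads.append(name)                     # genuinely new content
            catalog[digest] = name
    return uploads, matched

# The catalog already knows one track; only the home video must upload.
catalog = {hashlib.sha256(b"song-bytes").hexdigest(): "Track 01"}
uploads, matched = scan_and_match(
    {"my_rip.mp3": b"song-bytes", "home_video.mov": b"vid"}, catalog)
```

The same fingerprint-then-deduplicate pattern is what keeps an enterprise warehouse from storing ten copies of the same record as capture devices multiply at the edge.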
New Tools for New Times – Primer on Big Data, Hadoop and “In-memory” Data Clouds
Data growth curve: Terabytes -> Petabytes -> Exabytes -> Zettabytes -> Yottabytes -> Brontobytes -> Geopbytes. It is getting more interesting.
Analytical Infrastructure curve: Databases -> Datamarts -> Operational Data Stores (ODS) -> Enterprise Data Warehouses -> Data Appliances -> In-Memory Appliances -> NoSQL Databases -> Hadoop Clusters
———————
In most enterprises, public or private, there is typically a mountain of structured and unstructured data that contains potential insights about how to serve customers better, engage with them better and make processes run more efficiently. Consider this:
- Online firms–including Facebook, Visa, Zynga–use Big Data technologies like Hadoop to analyze massive amounts of business transactions, machine-generated and application data.
- Wall Street investment banks, hedge funds, and algorithmic and low-latency traders are leveraging data appliances such as EMC Greenplum hardware with Hadoop software to do advanced analytics in a “massively scalable” architecture.
- Retailers use HP Vertica or Cloudera to analyze massive amounts of data simply, quickly and reliably, resulting in “just-in-time” business intelligence.
- New public and private “data cloud” software startups capable of handling petascale problems are emerging to create a new category – Cloudera, Hortonworks, Northscale, Splunk, Palantir, Factual, Datameer, Aster Data, TellApart.
Data is seen as a resource that can be extracted, refined and turned into something powerful. It takes a certain amount of computing power to analyze the data and pull out those insights. That’s where new tools like Hadoop, NoSQL, in-memory analytics and other enablers come in.
What business problems are being targeted?
Why are some companies in retail, insurance, financial services and healthcare racing to position themselves in Big Data, in-memory data clouds while others don’t seem to care?
World-class companies are targeting a new set of business problems that were hard to solve before – Modeling true risk, customer churn analysis, flexible supply chains, loyalty pricing, recommendation engines, ad targeting, precision targeting, PoS transaction analysis, threat analysis, trade surveillance, search quality fine tuning, and mashups such as location + ad targeting.
To address these petascale problems, an elastic/adaptive infrastructure for data warehousing and analytics is converging on three capabilities:
- the ability to analyze transactional, structured and unstructured data on a single platform
- low-latency in-memory or solid-state device (SSD) storage for super-high-volume web and real-time apps
- scale-out with low-cost commodity hardware, distributing processing and workloads
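The scale-out bullet can be illustrated with the classic partition/map/reduce shape: split the records, aggregate each partition independently, then merge the partial aggregates. In this sketch worker threads stand in for cluster nodes; Hadoop applies the same pattern across commodity machines at petabyte scale.

```python
# Partition/map/reduce in miniature: threads stand in for cluster nodes.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def partial_count(chunk):
    """Map step: aggregate one partition locally."""
    return Counter(chunk)

def scale_out_count(records, workers=4):
    """Partition records, aggregate each chunk in parallel, merge partials."""
    chunks = [records[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(partial_count, chunks))
    total = Counter()  # Reduce step: merge the partial aggregates
    for p in partials:
        total += p
    return total

counts = scale_out_count(["view", "buy", "view", "view", "buy", "view"])
```

Because each partition is aggregated independently, adding nodes (here, threads) scales the map step horizontally, which is exactly why commodity scale-out beats scale-up for these workloads.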
As a result, a new BI and Analytics framework is emerging to support public and private cloud deployments.