Making the Most of Your Data with Advanced Analytics Webinar Series

Join Protiviti’s John Harris as he presents two dynamic webinars that will help you make the most of your data using advanced analytics. From developing a customer analytics strategy to capturing the vast amounts of information in your production systems to make prescriptive decisions, John’s sessions will give you insight into what’s happening now in data analytics.

Register for one or both! 

How CX Information Can Accelerate Your Customer Analytics Journey
Wednesday, Oct 3 @ 2:00 pm ET

Companies often struggle to maximize the value of the huge amounts of customer data they capture and retain, yet some are leveraging customer analytics to begin tapping into the value of that information. In this session, attendees will learn how to align customer experience information with a customer analytics framework to accelerate their journey to meaningful, actionable customer insights.

Key Learning Points: 

  • Review the overall purpose and framework of a customer analytics strategy
  • Learn the difference between a Customer Analytics program and a Customer Experience program
  • Discover why CX information is critical to a customer analytics strategy


Leveraging Advanced Analytics & ERP Data for Production Optimization, Capacity Analysis and Scenario Production Planning
Wednesday, Oct 10 @ 2:00 pm ET

Consumer packaging and manufacturing companies capture vast amounts of information with their production ERP systems. In most cases, this information is primarily used for reporting and ad hoc queries on things that have already happened. The information is rarely used to make prescriptive decisions on how to optimize production processes and planning decisions.

This customer case study will illustrate how Protiviti demonstrated the value of advanced analytics to a large manufacturing client, using the client’s ERP data to create optimized daily production schedules, increase plant capacity and execute scenario planning analysis. This capability was not available from the client’s ERP system or any of its plant shop-floor software packages. Attendees will learn how Protiviti used optimization modeling to accomplish a number of objectives, including reducing production costs, determining the impact of each customer’s order patterns on resource utilization, and evaluating the ability to absorb new customer demand.

Key Learning Points: 

  • Discover ways to leverage advanced analytics to maximize the value of ERP data
  • Learn how to identify analytics-based projects of utmost interest to senior management
  • See how to leverage optimization for operational and strategic planning decisions
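To give a flavor of the optimization modeling this session describes, here is a toy sketch of a daily production-scheduling problem. All products, margins, machine-hour figures and the capacity limit below are invented for illustration; a real engagement would derive these inputs from ERP data and use a proper LP/MIP solver rather than brute force.

```python
# Toy production-mix optimization: choose daily quantities of each product
# to maximize total margin without exceeding machine capacity.
from itertools import product

# Hypothetical products: name -> (margin per unit, machine-hours per unit)
PRODUCTS = {"A": (30, 2.0), "B": (45, 3.5), "C": (25, 1.5)}
CAPACITY_HOURS = 40  # hypothetical daily machine capacity

def best_schedule(products, capacity, max_units=20):
    """Brute-force search for the production mix with the highest total
    margin that fits within capacity. max_units caps each product's
    quantity to keep the search space small for this illustration."""
    names = list(products)
    best = (0, {})
    for qty in product(range(max_units + 1), repeat=len(names)):
        hours = sum(q * products[n][1] for q, n in zip(qty, names))
        if hours > capacity:
            continue  # infeasible: exceeds machine capacity
        margin = sum(q * products[n][0] for q, n in zip(qty, names))
        if margin > best[0]:
            best = (margin, dict(zip(names, qty)))
    return best

margin, plan = best_schedule(PRODUCTS, CAPACITY_HOURS)
print(margin, plan)  # → 650 {'A': 5, 'B': 0, 'C': 20}
```

The same structure — a decision variable per product, a capacity constraint, a margin objective — is what a linear-programming formulation would express, just solved exactly and at much larger scale.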

Categories: Events

Managing Data Governance in a Cloud-Focused World

The rate at which companies are amassing data is staggering. More than half of organizations today (57%) have production workloads running in the cloud, and with the number of new devices being introduced that create, consume and transmit data to the cloud, it has become critical to have some type of cloud governance program in place. However, one of the most challenging elements of such a program is managing an organization’s sensitive data, which could encompass anything from bank account and credit card numbers to HR payroll data. Misuse or negligent handling of this information could cost companies tens of thousands of dollars per record lost in a potential data breach. Beyond the monetary consequences, we’ve also seen how disastrous a data breach can be to customer confidence. Cloud governance is nothing to scoff at!

When the stakes are this high, it is understandable that companies are reluctant to trust the cloud. Gartner predicts that “through 2020, 95% of cloud security failures will be the customer’s fault.” However, cloud providers have made significant improvements to their security offerings over the last five years. This means that with proper planning and preparation, you can still reap the benefits of cloud efficiency and agility while maintaining appropriate levels of security.

Read more

Narjit Aujla

Getting to Excellence in Business Intelligence Webinar Series

We invite you to join experts from Protiviti’s Data Management and Advanced Analytics team for this three-part webinar series. Join us on consecutive Wednesdays in September at 2:00 pm eastern to learn how to take your BI programs to the next level.

Register for one or all three! 

A How-To Guide for Creating a Global Analytics Hub
Speaker: Marshall Kelley, Manager
Wednesday, September 12 @ 2:00 pm

Imagine bringing 15 different ERP systems into one unified analytics data mart using SAP Data Services and SAP HANA®. In this webinar, attendees will learn how to create a global analytics hub using several SAP® solutions in a complex technological environment. This session will also cover proven methodologies for BI success and share tips with attendees on how to gain valuable insight from real-world projects.

Key Learning Points:

  • Identify systematic and repeatable best practices for driving success and increasing user adoption
  • Review real-world projects, discussing the technology, strategy and methodologies
  • Hear tips about staying agile in a global analytics environment


Hybrid Analytics: Bridging the Gap between Cloud and On-Premise
Speaker: Patrick NeSmith, Director
Wednesday, September 19 @ 2:00 pm

It seems everyone has a cloud analytics solution these days, and there are many valid use cases for moving analytics to the cloud. But that doesn’t mean abandoning an existing SAP® BusinessObjects™ on-premise solution in favor of the cloud. This webinar will review use cases for both on-premise solutions and SAP Analytics Cloud and explore how a hybrid approach can help organizations quickly and affordably provide information from a greater number of sources to a wider user base – ultimately driving adoption and empowering users to make better decisions.

Key Learning Points:

  • Learn how to leverage your on-premise investment to get the most out of cloud analytics
  • Discover when to use on-premise, cloud, or hybrid
  • Learn how to add cloud analytics without complicated licensing


Creating a Business Intelligence Center of Excellence – A Step-by-Step Approach for Success
Speaker: Chris Hickman, Associate Director
Wednesday, September 26 @ 2:00 pm

Establishing a Business Intelligence Center of Excellence (COE) is a proven approach to achieving a strategic, cohesive, enterprise-wide BI environment. This session will walk attendees through the process of establishing a COE, an internal group that provides services and oversight to the various development groups within an organization. Attendees will learn how establishing a COE will allow them to guide BI initiatives within their firm (regardless of size or maturity) to achieve common goals. Attendees will also experience the challenges and benefits inherent in this process, along with the bottom-line results that can be expected when a successful COE is in place.

Key Learning Points: 

  • Understand how to evaluate your organization’s business intelligence maturity
  • Assess the value opportunities of implementing a center of excellence
  • Assess the risks associated with maintaining the status quo
  • Define the typical properties of a business intelligence center of excellence

Categories: Events

SAP HANA 2.0 supports LDAP!


One of the great new features available in SAP HANA 2.0 SPS0 is its support of LDAP authorization. SAP takes that a step further in SAP HANA 2.0 SPS3 by adding support for LDAP authentication with automated user provisioning. With this in mind, one could now state that SAP HANA supports LDAP in the enterprise environment. However, closer inspection of the evolution of its LDAP capabilities is required. Because SAP HANA’s LDAP support evolved from authorization to authentication and provisioning in later versions, the setup can get a little confusing. In many ways, the components of authorization and authentication can each operate independently, but using both together is the most practical approach. With that in mind, let’s look at authorization, authentication and user provisioning each in more detail. I will also conclude with an example setup using SQL commands.
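As a rough sketch of the kind of SQL setup involved: all hostnames, distinguished names, and user and role names below are hypothetical, and the exact clauses vary by SAP HANA 2.0 revision, so treat this as an outline and consult the SAP HANA Security Guide for the authoritative syntax.

```sql
-- Illustrative sketch only; names and DNs are hypothetical.
-- Authorization: define an LDAP provider and map an LDAP group to a
-- HANA role, so group members receive the role's privileges.
CREATE LDAP PROVIDER corp_ldap
  CREDENTIAL TYPE 'PASSWORD' USING 'user=cn=lookup,dc=corp,dc=com;password=********'
  USER LOOKUP URL 'ldap://ldap.corp.com:389/ou=users,dc=corp,dc=com??sub?(uid=*)'
  DEFAULT ON
  ENABLE PROVIDER;

CREATE ROLE reporting_users LDAP GROUP 'cn=reporting,ou=groups,dc=corp,dc=com';

-- Authentication (later revisions): have a user validate credentials
-- against the LDAP server instead of a local HANA password.
CREATE USER jdoe WITH IDENTITY FOR LDAP PROVIDER;
```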

Categories: HANA

Protiviti Authors Set to Launch New Book, Data Provisioning for SAP HANA®

Five SAP experts from Protiviti’s Data Management and Advanced Analytics practice have come together to write a 375-page guide, Data Provisioning for SAP HANA. Before data can be made available in SAP HANA, it must be standardized, integrated and secured. This book details the options for accomplishing that data provisioning, introducing readers to the various tools available and providing detailed case studies that demonstrate those tools in action.

SAP provides several options to extract, transform and load (ETL) data into SAP HANA. Data Provisioning for SAP HANA looks at each tool independently to understand the strengths and weaknesses of each, and where each typically sits in the IT landscape. Those tools include:

  • SAP HANA smart data integration (SDI)
  • SAP HANA smart data quality (SDQ)
  • SAP HANA smart data access (SDA)
  • SAP Agile Data Preparation
  • SAP Data Services
  • SAP Landscape Transformation Replication Server (SLT)
  • SAP Data Quality Management (DQM)
  • SAP HANA data in the cloud

Each chapter demonstrates how to install, configure and develop in these tools and case studies show how these tools were implemented in a number of organizations.

This book will be helpful for decision makers who want to understand the different options available to load data into HANA for new implementations, or for companies looking to simplify the IT landscape by utilizing some of the new tools to ETL data into HANA. Architects, developers and system administrators will also appreciate the book’s step-by-step instructions for configuration, development and administration of these data provisioning tools.

Congratulations to our authors: Don Loden, Managing Director; Managers Russ Lamb and Vinay Suneja; Senior Consultant Vernon Gomes and Consultant Megan Cundiff.

Data Provisioning for SAP HANA is expected to be available in June in hardcover and ebook format. Pre-orders are being accepted here:

Categories: HANA, Industry Trends

Demystifying SAP® HANA: Understanding Options, Determining the Best Path

Although SAP HANA products have been around for some time, they continue to evolve, and we continue to find that many clients remain unsure of how best to unlock the potential of HANA solutions within their organizations. Most know that HANA is SAP’s high-powered, in-memory column and row store database. Yet many overlook the fact that SAP HANA is much more than a really quick database. It offers a multitude of functionality: it serves as the database foundation for many of SAP’s NetWeaver-based solutions, a development platform and a data warehouse solution. In addition, SAP HANA can function as the foundation for the BI platform, giving users the ability to model business scenarios in real time and adding tremendous value. Yet not all versions of SAP HANA are equal.

At the recent 2018 BI/HANA conference, Protiviti Managing Director (and co-author of this post) Don Loden presented a breakout session on understanding and demystifying the SAP HANA options available. During that session, Loden discussed real-world use cases and used product demonstrations to help attendees understand when and where HANA options make sense and the impacts these solutions have on complex global organizations.

During the conference session, Loden outlined the most common options for SAP HANA. First is SAP Suite on HANA or SAP S/4HANA. With both, HANA becomes the data platform that runs business suite applications. But in a true S/4HANA world, users see another lift in performance because S/4 has been purposely designed for HANA.

Click here to read more.

Don Loden

Russ Cohen

Categories: HANA

Developing a High Performing Data Management Organization

These days, the hot analogy in the analytics industry is that “data is the new oil.” Like oil, data must be found, extracted, refined and distributed. More companies are investing heavily in cutting-edge technologies like machine learning, artificial intelligence and big data processing to help them harvest, groom and apply data to the correct use case for maximum value. So why, then, in the midst of this prospectors’ rush, do studies of business intelligence (BI) implementations repeatedly indicate that 60 to 85 percent of BI projects fail?

While tech is changing rapidly, the nature of most data management efforts has stagnated. Traditionally, the IT team has been seen as an all-knowing and all-capable “data priest,” producing the exact report requested by the business. We’ve seen businesses put a lot of focus on acquiring and storing data as cheaply as possible, while neglecting the equally important business use case and governance aspects. Because of this, we often see that data management organizations (DMOs) are not able to withstand the waves of change from sources such as new technology, organizational drivers and government regulations like the General Data Protection Regulation (GDPR).

Armed with that historical knowledge, I want to offer a few considerations for organizations to take into account when analyzing their DMOs.

Click here to read the full blog post.

Don Loden

Three Fundamentals for Building a Solid Data Governance Program

Time and again, we talk with clients who are neglecting perhaps the most important feature in a solid data strategy: data governance. With the explosion of data resulting from an increasing adoption of digital initiatives and the undeniable fact that we are now living in a data-driven world, it is more important than ever for organizations to recognize the importance of protecting data as a key asset. From regulatory challenges in the U.S. driving a need for better data governance programs and a trend in hiring chief data officers to the imminent General Data Protection Regulation (GDPR) in the European Union, the pressure is growing on organizations across all industries to recognize the need for better maturity in managing and governing data assets.

Data governance as a practice has been around for some time, but many organizations continue to struggle to incorporate basic data governance processes into their overarching data strategies. Those who fail do not always do so from a lack of effort. Where to start and how to build a data governance plan is still a significant issue for most companies, and we have seen many firms have multiple false starts before they are able to gain the needed traction.

During a recent webinar we hosted, we asked the audience – primarily IT, audit, finance, and risk and compliance professionals – to weigh in on how well their organizations are doing with data governance. A full 39 percent of this group told us they have no idea whether their data governance programs are effective. Even more startling, just short of 20 percent admitted their enterprise has no data governance program in place.

These numbers may appear surprising, but they are typical of what we see across all industries. Certain groups, such as financial services, do have a higher data governance maturity due to specific regulatory and compliance requirements, including anti-money laundering (AML) and Dodd-Frank regulations, and the fact that many banks have a global presence, making them subject to GDPR for their EU clients. Many organizations recognize the need for strong governance but often find it takes years to work through the complexities involved in establishing workable governance functions.

We understand the situation. We also know there is a way for organizations to build an outstanding data governance program that fits their needs, without the frustration. Here are just three tips to help get a data governance program started:

  1. Begin with an assessment of the organization’s current state. At Protiviti, we leverage multiple assessment models, including the Enterprise Data Management (EDM) Council’s Data Management Capability Assessment Model (DCAM) for financial services companies, and the Data Management Association (DAMA) International’s Guide to the Data Management Body of Knowledge (DMBOK®) across other industries. The DCAM framework includes eight core components ranging from data management strategy, data and technology architecture, and data quality to the rules of engagement for data governance programs. Whatever the model used, it should be matched to the organization’s needs and not applied generically.
  2. Establish a pragmatic operating model. Data governance programs must combine functional expertise, industry knowledge and technology in a well-organized and coordinated way that is planned, holistic, actionable, simple and efficient. We call that our PHASE approach, and it sets a solid foundation for future data governance by bringing together these three key components and identifying tactical steps to execute and operationalize data governance.
  3. Have simple guiding principles. We recommend that organizations:
    • Establish clear goals and purpose
    • Only put governance where needed
    • Keep the plan simple
    • Design from the top down, but implement from the bottom up
    • Be flexible
    • Communicate, communicate, communicate.

One of the most critical success factors in establishing a data governance program is to identify the value it will deliver to the organization. There is a risk this focus on value may get lost in compliance situations, where meeting a specific requirement is unquestionably the goal. Therefore, it is important for organizations to also ask: What real business problem are we addressing through our governance strategy? How will the organization be better off tomorrow than today as a result of our governance work? What are our data problems costing us – both in opportunity costs (not being able to pursue something) and in real monetary costs? And how can we do all of this with a smaller spend, showing quick value?

As chief data officers join the executive suite in increasing numbers, the importance of maturing data governance is confirmed. Ensuring that the data governance team has a seat at the table for all major business decisions and key projects – both business and technology – is proving to be a best practice and a critical success factor for the future of the organization’s data strategy. Data governance is a process, not a project. By making it a core competency, organizations will be ready to take on the data-driven future.

Matt McGivern

Josh Hewitt

Categories: Data Governance

What’s New in SAP S/4 HANA Implementations? A Report from GRC 2018

Note: Several of our colleagues from Protiviti’s Technology Consulting practice attended the SAPInsider 2018 GRC and Financials conferences. Their blogs on SAP-related topics are shared here. Mithilesh Kotwal, Director, discusses the importance of proactively addressing implementation risks during S/4HANA migrations.

Ronan O’Shea, our ERP Solutions global lead, delivered an insightful session reviewing the different responsibilities of the business during a system implementation. As he pointed out, systems must be designed from the outset to support the business. Organizations cannot expect system integrators (SIs) to develop these designs alone, as SIs are technical experts – not business process experts. This is why the business should be responsible for defining the vision and operational expectations for the future state of each business process that the new system will impact.

During his session, Ronan shared key system implementation statistics, including:

  • 74.1% of ERP projects exceed budget
  • 40% report major operational disruptions after go-live

What can you do to ensure your implementation does not become part of statistics like these?

The role that the business plays in an ERP system implementation is at least as critical as those played by IT and the system integrator (SI). The business owns the top four risks on an ERP implementation:

  • Program Governance
    • Misconception: The SI will manage the governance of the entire ERP implementation.
    • The truth: Typically, it is beyond the scope of the SI to provide the level of management needed to oversee the implementation end-to-end.
    • What should companies do? Establish a comprehensive PMO structure that manages the program beyond just the SI deliverables, i.e., one that includes:
      • Oversight of business and IT resources
      • Management of other vendors
      • Open engagement with company leadership on the risks and issues within the program
      • Unrelenting commitment to the transformation goals of the program.

These implementations are complex and have impact across many functions; the incentives of different parties must be checked and balanced.

  • Business Process Design
    • Misconception: The SI will guide us to adopt leading design practices baked into the software.
    • The truth: The requirements and design of the future solution emerge over time (if at all), leading to rework, changes, delays and missed user expectations both pre- and post-go-live. The SI is primarily a technical expert, not a business process expert.
    • What should companies do? The business retains the responsibility to define the vision for what to expect operationally of the new system with regard to each business process. This vision can take the form of:
      • Future-state end-to-end process flows that outline the automation level expected
      • Governing business rules (e.g., customer price calculations, cost allocations, tax computations)
      • Data requirements and event triggers for integrations to other systems
      • Controls and contingency or exception workflows
      • Who takes action

Take your time defining this vision so that you have a baseline against which to evaluate the technical solution delivered by the SI and can make sure you are meeting your transformation objectives. Assess process owners’ awareness and understanding of the expected outcomes of key design decisions.

  • Data Conversion
    • Misconception: Data conversion is a technical task with no business involvement – we can just move the data from the legacy system to the new one.
    • The truth: Companies often leave this activity until too late and without business involvement, resulting in incorrect data mapping and poor data quality that cause implementation delays and impact the operational effectiveness of the new system.
    • What should companies do?
      • Review the plans and design for the overall information strategy, data governance and data conversions, and the ability to ensure complete and accurate data will be available at go-live
      • Perform project-specific quality assurance procedures
      • Provide recommendations for longer-term initiatives to maintain data quality

Data is key: the business should treat data conversion design and data cleansing as a top-priority work stream and take operational and audit considerations into account. The business must establish strong data governance that extends beyond the successful rollout of the new system.

  • Organizational Change
    • Misconception: Organizational change is training, right?
    • The truth: Users and business process owners are unprepared to participate effectively in the project – business requirements, design, testing, training and adoption. Lack of focus on building user and management support, adoption and readiness leads to ineffective and inefficient processes and post-go-live disruptions, regardless of the quality of the system implemented.
    • What should companies do?
      • Examine user adoption / enablement plans for the system and processes, including ongoing user support and training processes, process organization change, and process performance measurement.
      • The business must plan to develop policies and procedures and define new roles and responsibilities, as well as deliver practical training.

Prepare the organization well for the transformation project you are undertaking. Engage users frequently to prepare them for the change and increase adoption.

These four key risk areas, in addition to other risk areas, are explored in detail in this white paper.

Mithilesh Kotwal, Director
Technology Consulting

Categories: S/4HANA

ICYMI: Protiviti’s Brian Jordan Talks Data Mining

In case you missed it, click here to listen to a recent episode of the “Coffee Break with Game Changers” radio show, presented by SAP.

In this episode, Protiviti Managing Director Brian Jordan joined Marc Kinast from Celonis and SAP’s John Santic to discuss “Digital Footprints: Mining the Data in Your Operations.” Tune in to learn why Brian’s favorite movie quote is from Clint Eastwood: “A man’s got to know his limitations.”

You’ll also learn why process mining is one of the hot trends in business intelligence today.

Brian Jordan