
Using Mad-Libs to Fuel Your Data Capabilities

There is little question that the availability of new and extensive amounts of data is changing our world. Nearly everyone has a smartphone, which acts as a sensor, collecting new data about nearly every action we perform. As processes become increasingly digital, new data can be collected about users’ behaviors. Furthermore, the Internet of Things is putting sensors on our bodies, cars, appliances, homes, equipment, planet, and nearly everything else. This rush of new data will accelerate change and force insurance companies of all kinds to adapt.

Alan Kay said in 1971, “The best way to predict the future is to invent it.” The big question for insurers is, “How can I invent it?” This is especially challenging given the pace of change insurers are facing. They are asking, “How can I find the right data to change processes, broaden service capabilities, increase sales, and bring more value to my customers?” The following simple brainstorming technique will help insurers explore the possibilities of data they may or may not have today. If you don’t have the data today, you may need to find a source that can deliver it, or look for data that is only now becoming available.

The key to this brainstorming exercise is the question, “What would I do if I knew?” It will help you broaden your thinking and move past your current constraints. You may recall the Mad Libs you completed as a child; this exercise is a little like one. Replace the “I” with a department or group, so it reads, “What would [claims] do if it knew [ ______ ]?” The first bracket could contain any department or group: sales, service, training, agents, marketing, underwriting, and so on.

In the second bracket, after the word “knew,” fill in data that you already have or wish you could get access to. Prepare two lists to draw from: data you already have, and data you could likely obtain. Using the same example, “What would [marketing] do if it knew [the income bracket of the beneficiary]?” Once that is filled in, explore what possibilities it opens up. In this example, outcomes might include the following:

  • We could send targeted offers about reinvesting the claims money rather than simply sending a check.
  • For wealthier beneficiaries, we could have an agent deliver the check (even for orphan policies) to build a relationship, rather than mailing it.
  • We could prioritize our engagement efforts and offer additional information, services, or other value to the beneficiary to establish a relationship with those beneficiaries who would make ideal customers.

Some additional examples could include:

  • What would sales do if they knew which agents were not illustrating our products?
    Possibility: We could send out additional training or offer to do illustrations for them.
  • What would service do if they knew how much their customers used mobile devices to interact with their life/annuity insurer?
    Possibility: We would know which services would provide the most value to high-value consumers.
  • What would sales do if they knew the buying triggers of prospects?
    Possibility: We might send targeted offers that align with each prospect’s life stage.
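If you want to seed a workshop with every prompt combination up front, a small script can generate them mechanically. This is a minimal sketch; the department and data lists are illustrative assumptions you would swap for your own inventories.

```python
from itertools import product

# Illustrative inputs: swap in your own departments and data inventories
departments = ["claims", "sales", "service", "marketing", "underwriting"]
data_items = [
    "the income bracket of the beneficiary",
    "which agents were not illustrating our products",
    "how much customers use mobile devices to interact with us",
    "the buying triggers of prospects",
]

# Print every "What would X do if it knew Y?" prompt for the handout
for dept, item in product(departments, data_items):
    print(f"What would [{dept}] do if it knew [{item}]?")
```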

As the availability and use of data continues to expand, insurance companies will need to find ways to successfully leverage that data to their advantage. This “what would you do if you knew” brainstorming technique can help insurers open up their thinking about what is possible, and how their customers can be better served.

We would love to hear how this went with your teams. Did you generate some new ideas? Did you wish you had data that you don’t currently have? Do you have a list of great ideas that you have no capacity to implement? Let us know in the comment section below.

Transform your agents’ world with data

To achieve data objectives, insurers should apply the three types of data (diagnostic, decision support, and alerts) to improve their services to their distribution partners and agents. Read the full whitepaper.

Big Data Opportunity or Big Risk?

How to Prepare Your Organization to Manage Risk and Capitalize on Opportunity

Many companies are accelerating their efforts to build analytics capabilities that will position them to realize the power of big data. The problem is that many companies exhibit key flaws in their core processes for managing and monitoring data quality in their current data assets, which can lead to costly data cleanup efforts and even compliance violations as they look to scale their data acquisition and utilization. In this article we will cover three common fixes that most companies can implement to protect their data quality and prepare them to enjoy the tremendous benefits of big data before it becomes a big problem.

The first step in positioning your company to leverage big data is to clean up what you already have. Insurance companies have reams of data rich with intelligence, but much of the existing value is lost through data redundancies and poorly captured data across business and core processes. Companies should focus on cleaning and integrating existing data sources and piloting analytics capabilities. This is how you start small, start fast, and capture return quickly. Chances are you’ll discover a few extra kinks you’ll be glad you found before making the leap into larger initiatives.

Second, begin standardizing the business rules for how data is captured and managed to avoid mutations or duplications in the data you already have. A common flaw across industries is the lack of standard, centralized business rules and data dictionaries governing how data is captured and maintained. This lack of standardization will continue to leak data quality issues into your organization and undermine expensive initiatives to implement data cleansing or new capabilities.
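To make this concrete, here is a minimal sketch of one centralized capture rule: a single shared validator applied at every intake point, rather than each system enforcing its own format. The field name and pattern are illustrative assumptions, not an actual standard.

```python
import re

# One shared rule for how a policy number is captured, instead of each
# intake system enforcing its own format. Pattern is an illustrative example.
POLICY_NUMBER = re.compile(r"^P-\d{7}$")

def validate_policy_number(raw: str) -> str:
    """Normalize and validate at the point of capture, before storage."""
    value = raw.strip().upper()
    if not POLICY_NUMBER.fullmatch(value):
        raise ValueError(f"policy number {raw!r} violates the shared standard")
    return value

print(validate_policy_number(" p-0012345 "))  # -> 'P-0012345'
```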

Third, you need the right data governance framework in place to drive data-centric change and monitor compliance with policies and rules. At a minimum, a hierarchical structure should exist, similar to a project structure. An executive steering committee should carry a defined set of responsibilities: monitoring issue management, return on investment, and overall company compliance with internal controls and external regulation. A middle tier, usually composed of senior managers, VPs, or high-level directors, is responsible for monthly planning and monitoring of data issue resolution. They are, effectively, the day-to-day managers of data-centric change and should meet regularly to make decisions, remove barriers, and ensure the appropriate business stakeholders are ‘in the know.’ This group should include a rotating spectrum of representatives (depending on the agenda) from functions across the business, not be managed in a silo by a single data or technology team. The remaining tier is the execution level: the day-to-day executors of the tasks and projects put in place to achieve governance maturity. This layer consists of data stewards and other data SMEs from across the organization. Often, suitable meetings and platforms already exist, so implementing a few small changes to integrate these groups is very practical.
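Here is the same three-tier structure sketched as plain data, the kind of record that could seed a meeting roster or RACI chart. The members, cadences, and responsibilities shown are illustrative assumptions, not prescriptions.

```python
# A sketch of the three-tier governance structure described above.
# Every member list and cadence is an illustrative assumption.
governance_tiers = {
    "steering_committee": {
        "members": ["CIO", "CFO", "Chief Risk Officer"],
        "cadence": "quarterly",
        "monitors": ["issue management", "ROI", "internal controls", "regulation"],
    },
    "management_tier": {
        "members": ["rotating senior managers, VPs, directors across functions"],
        "cadence": "monthly",
        "monitors": ["data issue resolution", "barrier removal", "stakeholder alignment"],
    },
    "execution_tier": {
        "members": ["data stewards", "data SMEs"],
        "cadence": "weekly",
        "monitors": ["tasks and projects that advance governance maturity"],
    },
}

# Print a quick roster summary
for tier, detail in governance_tiers.items():
    print(f"{tier}: meets {detail['cadence']}, monitors {detail['monitors']}")
```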

Leveraging big data certainly has inherent possibilities, but ignoring internal realities can turn the great data opportunity into a costly nightmare. Increased data management maturity and realized returns come more easily from leveraging existing assets and people than from investing in new data technologies and capabilities. Companies have a shared opportunity and responsibility to standardize the management of their data, improve data quality, and implement the appropriate governance structures that tie it all together.

Take the risk out of Big Data

Securing your data architecture is a continuous process, but the benefits will keep you “out of the line of fire.”

The Business Value Linkage: Securing Buy-In for Your Data Governance Projects

Over the past decade, the need for data quality and better governance has evolved from a struggling sales pitch to a must-have for executives across company functions and industries. Yet many CIOs and other enterprise information management (EIM) leaders are finding it hard to secure the buy-in needed to continue maturing data management practices and reducing the many risks that come from ‘dirty data’.

Executives most often fail to get the buy-in they’re looking for because the discrete outcomes, or project deliverables, do not have a clear connection to business objectives. In other words, can you isolate a key deliverable from your project and articulate to the senior leadership team how it will move the needle on one of their most pressing measures of business success? In most cases, this connection is not clear. Evidence of this connection is essential if executives are to gain resources and secure project funding, even when competing with newer business priorities, and to deliver a new product, service, or enhancement with maximum impact and no unnecessary cost or risk.

An effective way to track your efforts to create and maintain value linkage to your project objectives and deliverables is a Business Value Linkage dashboard. Think of it as a business case on steroids, a measurement tool, and the basis for status reporting. Keep in mind that each of these steps can be completed by the project team, but should be vetted with the key executives who will be impacted, so everyone is aligned on scope and ‘what they’re getting’ from Day 1. This is crucial to buy-in. Here’s a quick reference guide on how to get started building your value linkage dashboard.

  1. Define the project mission. That is, very clearly define, “what is the particular initiative or project going to deliver and how does it support the broader business objectives?”
  2. Define your objectives. You need to answer the question, “How do we define success for the project?” Best practice is to develop 3-5 objectives, depending on the size of the project or program, and include questions like, “will your project automate processes and reduce the likelihood of human error?” Or, “will it organize data more effectively in some way?” Make sure these objectives tie to your mission and broader business objectives.
  3. Identify specific business indicators. Your last step is to ensure your dashboard is supported with the organizational metrics that will help you answer the question of whether you are successful. If they do not exist today, you know you have a little more work to do. Indicators answer the question, “How will we track and report success for the project?” Here’s a practical tip: you likely already have a starting point if you have an existing data governance or quality dashboard. This step is typically overlooked in projects, but it’s necessary to continue telling the story, demonstrating value (or how a program may be deviating), and sustaining buy-in for the change. Don’t skip it!
    Here’s a conceptual view of what you’ll end up with: a dashboard linking data management deliverables to business objectives.
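As a rough sketch of those three steps in data form, the record below shows what a single value linkage dashboard entry might hold. The mission, objectives, metrics, baselines, and targets are all illustrative assumptions.

```python
# A sketch of one Business Value Linkage dashboard record, following the
# three steps above. All values are illustrative assumptions.
dashboard = {
    "mission": "Deliver a governed customer data domain that supports "
               "the company's cross-sell growth objective",
    "objectives": [
        "Automate address validation to reduce the likelihood of human error",
        "Consolidate duplicate customer records across ERP systems",
        "Publish a certified customer data dictionary",
    ],
    "indicators": [
        {"metric": "duplicate customer records", "baseline": 120_000, "target": 10_000},
        {"metric": "address validation error rate", "baseline": 0.08, "target": 0.01},
    ],
}

# A status report is then just a walk over the indicators
for ind in dashboard["indicators"]:
    print(f'{ind["metric"]}: {ind["baseline"]} -> target {ind["target"]}')
```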

The Business Value Linkage methodology is a powerful tool for isolating scope, gaining buy-in, securing funding, and articulating progress in terms that the ‘business’ can understand. By implementing a few fundamental shifts in how you propose your data governance initiatives, you will be well on your way to winning the executive support you need across the organization to implement better data governance.

Further your knowledge on linking to business value

Read Linking Business Value to System Initiatives and watch a short Business Value Webcast.

A Secure Data Architecture Will Keep You “Out of the Line of Fire”

Many organizations have built the foundation of their security architecture on physical and logical technologies designed to keep “bad actors,” or threats, from entering their environment. This approach is not without merit, as it mirrors what we have traditionally done with our homes. We install locks, alarms, and outside cameras, all with the notion of providing a strong perimeter that keeps the “bad guys” out. We supplement this by strategically organizing things in our personal lives to minimize losing everything if we encounter a natural disaster or someone does enter our homes, for example by keeping a safe deposit box at the bank.

But while you can protect your home from myriad threats from the outside in, there are still issues that can occur inside your four walls that the best protection, and even inside surveillance (e.g., internal cameras), won’t protect you from. Consider your kids, heaven forbid, playing with matches, or forgetting to unplug the tree on Christmas Eve, or your so-called best friend using your computer to look up movie times and stumbling upon your banking records. Whether at work or at home, your data is only as secure as the architecture it is housed in. Once a “bad actor” gets behind your defenses, whether from the outside in or the inside out, your architecture and the security principles upon which it’s built are what will protect your precious assets. So where to start?

  1. Assess and Prioritize Your Risk Surface – Assuming your organization has an existing data architecture in place, you’ll want to build upon your current investment by reinforcing layers of security upon it in a thoughtful and prioritized manner. You can’t afford to boil the ocean, and quite frankly, it doesn’t work anyway! Instead, sit back and think about the overall risks facing your organization, and then prioritize those that are most likely to occur and would have the greatest impact on the organization (a small prioritization sketch follows this list). For each risk you need to ask yourself:
    • Do I have the right architecture policies in place to protect my data sets and the systems that impact them?
    • Do I automatically have access to the right internal and external information so that I can perform predictive and prescriptive analytics when making architecture decisions and assessing my security posture?
    • Do I have the right processes in place to ensure my architecture is resilient from inside-out and outside-in vulnerabilities with minimal manual intervention?
  2. Enhance Your Data Architecture – Once you have performed a thorough analysis of your current risk profile and found the greatest bang for the buck, you need to implement an integrated strategy for strengthening your data security architecture. It’s not going to be sufficient to tactically address items a la carte; that’s not how robust architectures are developed. You can of course prioritize your efforts to dovetail with organizational strategies and budget cycles, but the overall roadmap must be holistic in nature, with security dimensions that work seamlessly together in an orchestrated, adaptive manner. You need to harden your systems from the inside out first AND isolate the platforms that are mission critical to your organization from the rest of your systems. This, of course, goes hand in hand with reinforcing your perimeter security and firewalls. You also need to logically structure and segment your data, and access to it, so that people gain access only on a “need to know” basis. Even then, when they do access it, you must be sure they are actually who they claim to be. This must be supported by protections that keep all data private and confidential, so that prying eyes are unable to peer into your data should they somehow gain access to it from either inside or outside your company. And remember, this means data at rest in your databases, data traveling within the confines of your network, and data traveling over an external network. It’s all “fair game,” and as we see every week in the news, no data is truly safe.
  3. Prepare for the Worst – Most people maintain some type of insurance, whether it be homeowners or life insurance, because they either think it’s a prudent thing to do (stuff does happen!), or because the law says they have to (e.g. automobile insurance). Data security risk mitigation requires the same level of vigilance and balance. It all starts with an appropriate Monitoring, Intelligence Gathering and Threat Assessment capability tied to the Risk Assessment discussed above. This continuous process must be a combination of automation, human intelligence (internal/external), third-party expertise, scenario assessment and testing, and most of all, imagination and planning. We stress the final point because the sophistication of threats grows each day, and no one person or organization is able to solve all problems by themselves. It does require partnerships.
  4. Detect and Respond – At some point many, if not all, organizations find themselves the target of some type of incident. Many are small in nature. It may be something as simple as a user accidentally deleting important data, which can still have a critical impact on a firm. Or, as we have seen, it can be a rogue nation state attacking a major corporation or government. While the details may differ, and the amount of press attention may vary, the impact and disruption to those involved often feel very much the same. Your organization’s ability to proactively detect when a potential incident may have occurred, to validate that it is in fact an incident, and to isolate and remediate it can literally make the difference in the short- and long-term viability of your firm. Keep your company “Out of the Line of Fire.”
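Returning to the first step above, here is the promised prioritization sketch: score each risk by likelihood and impact, then work the largest products first. The risks and the 1–5 scores are illustrative assumptions.

```python
# Score each risk by likelihood and impact (both 1-5), then rank by the
# product so the biggest exposures get attention first. All entries are
# illustrative assumptions, not a real risk register.
risks = [
    {"name": "production data copied to QA unmasked", "likelihood": 4, "impact": 5},
    {"name": "over-broad database entitlements", "likelihood": 5, "impact": 4},
    {"name": "unencrypted data in motion to a third party", "likelihood": 3, "impact": 5},
    {"name": "accidental deletion by a privileged user", "likelihood": 2, "impact": 4},
]

for r in sorted(risks, key=lambda r: r["likelihood"] * r["impact"], reverse=True):
    print(f'{r["likelihood"] * r["impact"]:>2}  {r["name"]}')
```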

Securing your data architecture is a continuous process. It requires a holistic approach: looking at the strategic drivers of your organization and the systems and technologies required to support them, along with the competitive and threat landscapes that evolve over time. As each of these evolves, you will find that the robustness of your planning, the resilience of your architecture, and your overall responsiveness prove to be invaluable assets to organizational viability and growth.

See how NEOS has helped its clients with Data Architecture projects

Architecture Health Assessment Case Study NEOS

Centralized Reporting Architecture NEOS Case Study

NEOS Sponsors North Texas Chapter of the Data Management Association

Sponsorship seeks to boost the capabilities of the start-up chapter

Hartford, CT – February 23, 2016 – NEOS, a management consulting firm that helps its clients solve challenges related to data, process and technology, today announced it is a leading sponsor of the North Texas Data Management Association (DAMA).

NEOS understands the importance of a sustainable professional organization that promotes best practices in data and information management, and has committed to supporting the start-up chapter to realize this vision. NEOS and North Texas DAMA seek to develop a well-informed and motivated membership base and to promote the exchange of information, trends, and best practices among information management leaders.

Kevin Ladwig, managing consultant at NEOS and member of the North Texas DAMA Board of Directors, says of the affiliation, “I worked to restart the North Texas DAMA Chapter because I recognized that there was a need for information professionals in North Texas to share ideas regarding Enterprise Information Management (EIM). NEOS excels in the area of EIM and is, thereby, a natural sponsor for promoting and imparting a pragmatic approach to data management.”

To learn more about NEOS, visit www.neosllc.com.  For information about North Texas DAMA, visit http://www.northtexasdama.org/.

About North Texas DAMA

North Texas DAMA promotes the understanding, development, and practice of managing data and information as key enterprise assets. North Texas DAMA is a not-for-profit, vendor-independent association of technical and business professionals dedicated to advancing the concepts and practices of information and data management. North Texas DAMA is an essential resource for those who engage in information and data management.

About NEOS

NEOS is a management consulting and technology services firm specializing in the global insurance and financial services industries with deep experience in holistic modernization, enterprise data and business operations consulting. Clients range from large multi-line companies to more specialized providers. Solutions encompass legacy product and closed block management, operational and IT risk, and business-technology strategy. Services include process, organizational and operational consulting, enterprise architecture strategies and design, deployment and data analysis.

The Association of Metadata with Data Governance

Historically, we hear metadata defined as “data about data.” I prefer a fuller definition: metadata is data that defines and describes other data within a given context or set of circumstances, for particular purposes, and from specific viewpoints. All metadata is data, but that does not mean all data is metadata. In other words, metadata exists only to describe the characteristics surrounding a particular data construct. With enough characteristics stored in a database (metadata repository) or a data governance tool, such as Collibra, the Data Governance organization is able to formulate the appropriate conceptual models, meant for both business and IT to read, understand, and use to make quantifiable data management decisions.

For Data Governance to succeed at managing data assets and related resources, it must rely on metadata. Metadata is used to describe matters such as who does what with the data, who needs the data, who produces the data, and even who is accountable for the data (i.e. data steward). Frankly, we are collecting metadata all the time. We have it in our data models, classification schemes (data protection), databases, glossary, issues log, process flows, system maps, business rules, quality rules, privacy rules, stewards, reports, ETL processes, data lineage (lifecycle), and even on our organizational charts. Yes, metadata is simply about collecting information about data, activities, and people.

As you would expect, metadata does not govern itself. It too is data that needs to be governed using the same activities the organization uses for non-metadata.

Metadata is captured based on information asset requirements and is described and used in various models and business intelligence reports. Traversing the evolutionary lifecycle of metadata is relatively straightforward: you define what the business needs to know, design and build the related models and reports, and create and distribute the informational assets (models and reports) to the appropriate parties. We use a three-step model to convey this lifecycle: definition, generation, and consumption.


The Definition Stage is where Data Governance establishes the requirements (what to collect) and the standards for creating conceptual models. These requirements are nothing more than questions that need to be answered through metadata; they solidify what metadata is required and what that metadata means across the enterprise. Some example questions metadata helps to answer are listed below (a sketch of a metadata record that answers them follows the list):
  • What customer or product data is linked to what system, and what processes use that system?
  • Who is accountable for that system?
  • Who are the users of that system and where are they located?
  • What sales channels are linked to that system?
  • Where is all of the PII, PCI, and other sensitive data?
  • What is the meaning of “customer” across all lines of business?
  • What is the most reliable data source?
  • Who are the data stewards for that master data domain?
  • Who has create, read, update, and delete authority across data-related resources?
  • Where does the data reside and how does it flow through systems?
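Here is the sketch referenced above: a minimal metadata record shaped to answer several of these questions. The field names and values are illustrative assumptions; a real repository (or a tool such as Collibra) would hold far more.

```python
# A minimal metadata catalog: each record describes one data element.
# All names, systems, and stewards are illustrative assumptions.
metadata_catalog = [
    {
        "element": "beneficiary_income_bracket",
        "definition": "Beneficiary household income band, per the business glossary",
        "system": "PolicyAdmin",
        "steward": "Marketing Data Steward",
        "classification": "PII",
        "consuming_processes": ["claims payout", "targeted offers"],
        "lineage": "PolicyAdmin -> nightly ETL -> marketing data mart",
    },
]

# "Where is all of the PII?" becomes a simple filter over the catalog
pii_elements = [m["element"] for m in metadata_catalog if m["classification"] == "PII"]
print(pii_elements)
```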

The Generation Stage is where the Data Governance organization produces the conceptual models and reports that answer the questions derived from the Definition Stage. These models can simply be spreadsheets in the short term. However, it’s recommended that the Data Governance organization move toward more sophisticated models that conceptualize the relationships of a particular data resource or data element across all business and IT viewpoints. This avoids the pain caused when copies of metadata live in multiple spreadsheets and one element deviates, requiring research and breaking the level of trust required for a successful Data Governance program. In this stage it’s important that all participants agree on where the source of metadata shall reside, how it should be accessed, and when it’s appropriate to collect it from any specific data management process (e.g., data quality management, master data management, data architecture, and so on).

The Consumption Stage is probably the most important. It’s responsible for providing the right conceptual models to the right users at the right time to make the right decisions. For example, users receive the conceptual models before any project work begins so that they can perform an Impact Assessment on any potential changes, thus allowing both business and IT to locate the impacted data element, system, data steward, and so on. Metadata definitions unlock the value of data, turning enterprise information into assets.

Metadata is absolutely needed to keep data governance running smoothly. You will know you have the correct level of metadata when your quality dashboard reflects an improvement in data quality, users understand the conceptual models and use them for their designed purposes, data is being protected and you can prove it, and, last but not least, analytical reporting capabilities are indeed more reliable.

Since all metadata is really data, successful data governance mandates that companies govern metadata itself using the same established data governance principles. In that sense, metadata issues are tracked and metadata repositories are kept up to date as part of everyday duties, which includes recording metadata updates related to both issues and projects. Not all companies grasp that data governance necessitates metadata updates as part of typical issue resolution and project life cycles. A conceptual model reflecting all the projects, in flight or planned, that will impact a particular data element is worth mandating, especially when you are concerned about data protection and maintaining BCBS 239 compliance.

Where does one start? Take inventory of data-related assets such as existing systems, critical processes, users, stewards, data sets, and so on. In conjunction, flesh out the business glossary and work with IT to define the data dictionary, which probably has many technical permutations of the business terms defined in the glossary. Along the way, the Data Governance team must set aside time to build an inventory of questions that both the business and IT agree are worth answering on a specific cadence. At the end of the day, the enterprise will start to feel as if it’s speaking with the same common vocabulary, that it’s able to better manage data risks and data challenges, and that it can trust its financial and operational reports.

Want to continue reading about Data Management?

Make sure you didn’t miss our latest articles on Data Value Management and Securing your data so you can focus on growth.

Data Value Management: The Unsung Hero of Data Governance

The enormous enterprises of the 21st century have many years of accumulated data resting across multiple disparate ERP systems, legacy systems, and data environments. As a result, data management has become very complicated, and prioritizing data-related work is an exceptionally daunting undertaking for Data Governance organizations. To alleviate this data management dilemma, successful Data Governance organizations must implement a value-based data management process that attempts to objectively quantify the data’s relevancy to the business.

To further validate that the business relevancy of data is still one of the most misunderstood data problems organizations face, I asked my college-bound, millennial-aged son what came to mind when I asked, “What does data value management mean to you?” His response, in a matter-of-fact tone, was, “It’s the value of data and how to organize it. It needs to be organized to tell a story on what to do first based on what is the most important, what you should know first, and even what is irrelevant. You should be able to organize data from most important down to unimportant, and yet it still be relevant.” Okay, I guess it is all about data “relevancy” after all. So, based on those profound words of wisdom from a young man a fraction of my age, the challenge that presented itself was: how does one measure relevancy? There has to be a model, or set of models, that can help large enterprises manage data relevancy.

After some inner reflection on past value management endeavors, and a peek at what Gartner has to say, I can confidently state that if an organization evaluates its data as a whole using the following six value streams, it will have determined the data’s underlying value and relevancy to the organization. The six value streams are:

  • Cost Value Model: This model assesses the financial impact to a company when data is lost, stolen, or corrupted.
  • Economic Value Model: This model assesses the degree to which data actually contributes to generating income for the company.
  • Internal Value Model: This model expresses the level of privacy and uniqueness of the data, as it relates to the success and enablement of the enterprise.
  • Market Value Model: This model is used to address the perceived monetized value of the data for any given marketplace.
  • Performance Value Model: This model aligns itself with data used to help companies clarify and monitor their business drivers (e.g., technological innovation, analytical reporting, superior products, excellent service, and ongoing customer support).
  • Quality Value Model: This model concerns itself with scoring the quality of data required to ensure data is fit-for-purpose across the enterprise.

Even when determining the overall score across these six data value models, or value streams, there is still a bit of guesswork involved in the scoring process. Determining the data’s value or relevancy is not an exact science. However, if you consider some of the definitions implied by the models, you will have a better grasp of how to apply data value management across your Data Governance organization.
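One simple way to combine the six streams, accepting that guesswork, is a weighted composite score per data set. This is a minimal sketch under assumed weights and 1–5 scores; a real program would calibrate both with the business.

```python
# Weighted composite relevancy score across the six value streams.
# Weights (summing to 1.0) and the 1-5 scores are illustrative assumptions.
weights = {
    "cost": 0.20, "economic": 0.25, "internal": 0.10,
    "market": 0.10, "performance": 0.20, "quality": 0.15,
}

def relevancy(scores: dict[str, float]) -> float:
    """Weighted average of the per-model scores for one data set."""
    return sum(weights[model] * scores[model] for model in weights)

customer_master = {
    "cost": 5, "economic": 4, "internal": 5,
    "market": 3, "performance": 4, "quality": 3,
}
print(f"customer master relevancy: {relevancy(customer_master):.2f} / 5")
```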

Understanding the relevancy of any given data set provides important feedback to the business on which data is extremely important and which data should clearly have the most attention or resources available to manage it through its lifecycle. Once an enterprise begins to speak about data in terms of value, it tends to see the following characteristics manifest across the enterprise:

  • Improved regulatory adherence and compliance reporting, especially surrounding BCBS 239.
  • Improved data quality processes.
  • Improved data management capabilities.
  • Improved performance reporting across business areas that drive performance.
  • A mindset of performing the right best practice for the right data at the right time for the right expenditure of time and money.
  • A self-sustaining approach to data value management, informed by data value model learnings.

Without knowledge of the data’s business relevance, the enterprise is most likely investing energy and money in governing the least important data, and is at risk of losing its competitive edge. So, to get an accurate representation of the relevancy or value of the enterprise’s data, implement the value models one or two at a time, and as you learn more about your data assets, adjust accordingly. This will help your Data Governance organization develop a fully functioning data value management process with the capability to determine and track the appropriate value for any given data asset.

Keep reading on how to get value from your data

NEOS’s data services and solutions can help you get the most from your data.

Secure Your Data so You Can Focus on Growth

Each morning you start your day and think about the growth opportunities that lie ahead. Some of these are strategic endeavors and others are tactical in nature. Yet one thing they all have in common is that they are grounded in the assumption that you and your team will have access to the data necessary to make the critical decisions that drive your business. But will you?

Have you and your organization considered the external and internal threats that put your firm’s data at risk? Did you know that data in motion, that is, data being manipulated or transformed, is subject to different types of security threats than data at rest, which is simply being stored? Are you protected against these various types of interruptions to your business’s growth opportunities?

The first quarter is the perfect time to assess the vulnerabilities that put your data at risk, and to develop and execute the mitigation strategies, processes, controls, and technologies necessary to protect it. Do you have the necessary identity management, entitlements, and access controls in place to ensure that only those with “the need to know” have access to your data? Can you actually recover the business-critical data necessary to run your business within the time required by the regulations your organization must comply with? Is the data used to test your applications secure and confidential? If you answered yes to all of these questions (honestly now), then you are far more mature than most firms.

In many cases, we at NEOS have found that organizations face huge data security vulnerabilities that result from a series of business risks that are not intuitively seen as related, yet come together to create a huge business impact. These risks are often subtle in nature and arise from unrelated situations, such as the case where your organization faces a deadline to comply with a new regulation and the call goes out for “all hands on deck!” Of course, you have controls in place to protect against defects being released into production, but what about the case where a copy of your production data is being used in your QA environment to test this critically important set of changes? While the firm is focused on complying with the regulatory deadline and addressing business as usual, there is a hiccup with the replication between the “all hands on deck” effort and your normal business-as-usual work, and the “QA data” (which is actually a copy of real production data) is sent out to your production environment. Disaster! This is a data security breach from the inside out, and it wasn’t even caused by a third party. That’s where a trusted data security expert like NEOS can help.

Take this time to ensure that your company’s data is protected. This is particularly important for confidential and material non-public information (MNPI) in highly regulated environments. The investment you make now will reduce the risk surface for your organization and the stakeholders that you serve. NEOS can help ensure that you are using the appropriate tools and techniques, including data masking and encryption. We can also help ensure that you have a robust access control framework in place so that only those with the need to know have access to your second most important asset – your data (your first being your people, of course).
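As one illustration of the masking technique mentioned above, here is a minimal sketch of deterministic masking for test data. It assumes QA needs values that are consistent across tables but not real; the field names and salt are illustrative, and a production program would use vetted masking tooling and proper key management.

```python
import hashlib

# Deterministic masking: sensitive fields are replaced with salted hashes,
# so QA keeps referential integrity without exposing real values.
# The salt and field names are illustrative assumptions.
SALT = b"rotate-me-per-environment"

def mask(value: str) -> str:
    """Replace a sensitive value with a short, repeatable surrogate."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:12]

record = {"policy_id": "P-1001", "ssn": "123-45-6789", "name": "Pat Smith"}
masked = {k: (mask(v) if k in {"ssn", "name"} else v) for k, v in record.items()}
print(masked)  # policy_id preserved; ssn and name masked consistently
```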

All it takes is “The Perfect Storm” of third-party imposed deadlines, coupled with the normal over-abundance of business as usual workload and then an unanticipated data security breach, to throw an organization into disruption. By engaging experts like NEOS now, you can reduce the likelihood that these types of risks will impact your organization; and if the unfortunate should happen, you will be prepared to respond with proven mitigation strategies and responses so that your data is protected.

4 Tips to Get the Most out of Your Consulting Engagement

Management consulting is a billion-dollar business: depending on which study you look at, companies in the US spend between $150bn and $200bn annually. Much of this money goes to impractical or unwieldy solutions that clients are culturally, financially, and strategically unable or unprepared to implement, and that they paid too much to receive. Or clients find themselves with only a partial solution and a proposal to complete the effort, which of course means spending additional money.

As people who have been on both the consulting and client sides of the table, we understand the pressures on our clients to get value for their consulting dollars. These pressures sometimes lead companies to choose the least expensive consulting option with the goal of minimizing investment over time. Many times we have been called in to re-do or rescue a project where the cheapest consulting option turned out to be an inadequate solution. On the other hand, nobody wants to end up over-paying for “shelf-ware” or receiving an over-built solution. So how can you ensure that you get what you need?

From our experiences as clients and consultants, we’ve assembled four best practices to keep in mind when reading through RFPs, conducting negotiations, or executing an engagement. If you have additional lessons learned, we invite you to share them with us in the comment section of this post.

  1. Know what you need. 
    Good consultants will ask probing questions designed to encourage you to articulate what you’d like to achieve through the engagement. Great consultants focus on both the immediate outcomes (aka deliverables) and the project’s long-range impact to your business. The more you can share with the consultants about your desired improvements to service, increased speed to market, or reduced cost, the better positioned your consultant partner is to propose and execute a project that will accomplish those goals. If you can’t articulate your objectives and how the project will align with your overall business direction, you aren’t ready to engage a consultant. Save your money and spend the time getting yourself organized.
  2. Right-size the firm you choose.
    There are thousands of consulting firms, from elite strategy to global system integrators, and one size doesn’t fit all. Consider whether you want a partner or a provider. Do you need a cadre of offshore resources versus a few highly skilled experts, and does that firm have the expertise overall to address your challenge? The smaller firm with the industry experience might be the better choice in some cases.
  3. Commit.
    It seems counter-intuitive, but commitment is essential for success. You are more likely to get the best a consulting firm has to offer if you can commit to a longer engagement. When a firm has to re-propose every three months, or if there are significant breaks between phases, the firm’s attention very naturally will shift to more consistent, longer-term clients. In addition, the best consultants are usually assigned to longer-duration engagements, and you can generally negotiate a preferred price model over a longer period of time.
  4. Stay engaged.
    Inspect what you expect. Don’t wait for the end of the engagement to review the outcomes and provide feedback. Insist on weekly or bi-weekly status meetings where you can review any in-flight deliverables, hear the highlights of discovery sessions, and direct the approach the project is taking. A good consulting firm will welcome this sort of hands-on relationship, as long as it doesn’t devolve into micro-management.


We’d love to hear your thoughts on how you’ve worked with consulting partners to get what you need. As we said, a good consulting firm welcomes feedback, so share your lessons learned with us below. Happy project execution!

Duct Tape Mentality for Business Process Improvement

Business process improvement is like duct tape; it’s useful for so many different things. Sure, duct tape is great for taping ducts, but it’s also good for much, much more, and a good business process improvement effort can help you do more for less. It can also help you do more than just save money. It’s important to consider additional benefits when undertaking or considering business process improvement. This blog will focus on three in particular: boosting innovation, freeing time and resources for more effective work, and easing compliance risks.

A good business process helps worker bees repetitively produce flawless widgets, right? If you want innovation, then you probably need to look elsewhere, right? Wrong. I don’t mean to state the obvious by pointing out that business process improvement… well… improves business processes, but improvement can take whatever form business leaders identify, and in many cases mindless efficiency is not actually an improvement. NEOS recently undertook an effort for a national life insurer that was having difficulty innovating new products. We took process into account as part of an effort to increase product innovation and development, and the result was a process for deciding which ideas to pursue that was rich in decision points and alternative paths. This allowed the flexibility inherent in the creative process to shine through uninhibited, while still providing guidance, structure, and manageability.

When business process improvement helps you do more with less, it creates another beneficial side effect: it frees up time and resources for other purposes. For example, when we finished a business process redesign project for an annuities provider, the result was a process that required one less day from start to finish (from 4 days down to 3). Leaders were then in the enviable position of deciding what they wanted to do with their extra day. Resources were more freely available for a variety of more valuable projects, like improving the advisor experience and improving their workflow system.

It’s also important to mention the benefits that this kind of effort can provide from a compliance perspective. Well-documented business processes are easier to manage, more transparent, and promote consistency in their execution and output. This is all good news from a compliance perspective, but this is only taking into account the documentation portion of business process improvement. The other half is actually improving the processes, and that half yields even more benefit. When taking the time to redesign a business process, it is often a good opportunity to put controls under a magnifying glass to see if they are adequate for their purpose. Solid controls are the first line of defense against reputational damage and regulatory penalties.

These three benefits are only examples. Like duct tape, business process improvement has countless uses beyond increased innovation, extra time and resources to reinvest, and reduced compliance risk. If you have experienced any others, let us know in a comment below. Happy business process improving!

Gear up for Continuous Process Improvement

Some companies are failing to cash in on Continuous Process Improvement efforts, so learn from their mistakes. Read our Continuous Process Improvement blog post to make sure you steer clear of these pitfalls.

