
Better Resource Utilization - Using Business Applications?

In an earlier post, I outlined an idea to improve the usability of enterprise systems by creating a unified task dashboard. By having one dashboard for all activities, which could span multiple applications, users get a holistic view of their work.

In this post, I want to extend that idea and propose that software companies and product managers expand the capabilities of their task and workflow modules and start looking into unified resource utilization!

The first step would be to capture business process execution with accurate tasks within workflows. The second step would be to accurately estimate the time required to perform the tasks.

If and when we can track all tasks across all applications, we can generate data, reports, and metrics on resource utilization, estimate current and future workloads accurately, and assign the right resources to the right problems, improving the effectiveness and efficiency of the organization.
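As a rough, purely illustrative sketch of what such tracking could enable (the `Task` schema, capacity figure, and sample numbers below are all invented, not any vendor's API), per-resource utilization could be rolled up from a unified task feed:

```python
from dataclasses import dataclass

@dataclass
class Task:
    """A task harvested from any enterprise application (hypothetical schema)."""
    resource: str        # assignee
    source_app: str      # e.g. "ERP", "PLM", "CRM"
    estimated_hours: float

def utilization(tasks, capacity_hours=40.0):
    """Return estimated utilization per resource for one planning week."""
    load = {}
    for t in tasks:
        load[t.resource] = load.get(t.resource, 0.0) + t.estimated_hours
    return {r: h / capacity_hours for r, h in load.items()}

tasks = [
    Task("alice", "PLM", 12.0),
    Task("alice", "ERP", 30.0),   # alice is over-allocated across two systems
    Task("bob", "CRM", 8.0),
]
print(utilization(tasks))   # {'alice': 1.05, 'bob': 0.2}
```

The point of the sketch is that the over-allocation only becomes visible once tasks from both systems land in one feed.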

"Disclaimer: The views and opinions expressed here are my own only and in no way represent the views, positions or opinions - expressed or implied - of my employer (present and past) "
"Please post your comments - Swati Ranganathan"

Tips for Successful Data Migration.

  1. Maintain your sense of humor.
  2. Expect delays and/or road blocks.
  3. Run the data migration using traditional project principles.
  4. Secure alignment and approval from steering committee and stakeholders as changes occur.
  5. Appreciate the inter-dependencies.
  6. Understand your business process, data, system and application landscape. (Devil is in the details)
  7. Get the right software tools.
  8. Use the right resources.
  9. Plan for down time.
  10. Perform at least two dry runs (wash, rinse, repeat).
  11. Develop a risk mitigation plan.
  12. Communicate your plan early and socialize it with all impacted users.


"Disclaimer: The views and opinions expressed here are my own only and in no way represent the views, positions or opinions - expressed or implied - of my employer (present and past) "
"Please post your comments - Swati Ranganathan"

Intelligent part numbers! Why and Why Not, Can PLM / ERP systems help?

All companies that make and sell products have to decide early on whether to use intelligent part numbers. Some companies might not have the maturity or business processes in place when this decision needs to be made. Changing the policy later can introduce additional complications, so most companies choose to remain on their current policy and process.

Intelligent part numbers are used to clearly identify the type of part, its commodity, or sometimes even the location of use in the overall product. Typically, companies develop a matrix mapping commodity to specific sequences of part numbers. For example, 12-???? could represent sheet metal, 13-???? could represent PCBs, and so on.

Unintelligent part numbers, on the other hand, rely on the ERP/PLM system's ability to automatically generate the next sequential number.
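To make the contrast concrete, here is a minimal sketch of the two schemes; the prefix matrix, number format, and starting sequence are all invented for illustration:

```python
import itertools
import re

# Intelligent scheme: a commodity matrix maps prefixes to part types (example values).
COMMODITY_PREFIXES = {"12": "sheet metal", "13": "PCB"}
INTELLIGENT_FORMAT = re.compile(r"^(\d{2})-(\d{4})$")   # e.g. 12-0042

def commodity_of(part_number):
    """Look up the commodity encoded in an intelligent part number."""
    m = INTELLIGENT_FORMAT.match(part_number)
    if not m or m.group(1) not in COMMODITY_PREFIXES:
        raise ValueError(f"{part_number} does not match the commodity matrix")
    return COMMODITY_PREFIXES[m.group(1)]

# Unintelligent scheme: the system simply hands out the next number.
_counter = itertools.count(100000)
def next_part_number():
    return str(next(_counter))

print(commodity_of("12-0042"))   # sheet metal
print(next_part_number())        # 100000
```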

Companies that use intelligent part numbers can easily distinguish top-level products from lower-level assemblies, and they can build logic to drive procurement, review, and approval cycles based on part number sequences.

Developing part number sequences can be costly: they require manual setup, customization of applications (ERP, PLM, etc.), and due diligence on the part of engineers to follow a defined process. Depending upon the rules, the business groups must pay attention to the number of possible part numbers for a given commodity (by projecting into the future) and also plan on adding new commodities as the need arises. Typically this can require additional head count to manage the process and tools. In my experience, many engineers would rather focus on innovation, turn out their designs, and move quickly from concept to prototype to production release, rather than be bogged down by having to pull a new number and update their documentation.

Unintelligent part numbers let engineers conceptualize a design and generate new numbers easily with minimal data in the beginning, quickly release the design, and then provide additional data. Engineers often do not know the right commodity or material when they are working on a concept; this typically results in non-value-added work recreating parts with the right material and commodity when a mistake is made. Unintelligent part numbers do have a drawback: the number itself provides no information about the part type or any other data.

As ERP and PLM systems have matured, most have introduced a classification scheme or module with which parts can be classified. Typically, classification systems capture information such as whether the part is OEM, its commodity and material, whether it is an assembly, whether it is compliant (for RoHS, WEEE, REACH, etc.), and whether it is a critical part; in addition, the description can be broken down to clearly identify the part. For example, a socket head cap screw could be classified into a class of screws with a sub-group of socket head, and so on.
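As a hedged illustration of what such a classification record might capture (the field names and schema here are my own invention, not any specific PLM/ERP module):

```python
from dataclasses import dataclass, field

@dataclass
class PartClassification:
    """Illustrative classification record; real PLM/ERP schemas vary widely."""
    part_class: str              # e.g. "screw"
    sub_group: str               # e.g. "socket head"
    commodity: str
    material: str
    is_oem: bool = False
    is_assembly: bool = False
    is_critical: bool = False
    compliance: dict = field(default_factory=dict)   # e.g. {"RoHS": True}

shcs = PartClassification(
    part_class="screw", sub_group="socket head",
    commodity="fasteners", material="stainless steel",
    compliance={"RoHS": True, "REACH": True},
)
```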

So if we can get this granular and capture all the information we need, we can use the classification system to drive activities like procurement based on commodity, ABC coding by commodity or part class, and conditional workflows for ECO cycles based on part type and whether a full review is required. There are other uses as well, such as knowledge management and asking the right questions when quality issues occur, based on the type of part or product.

New part creation could be streamlined by checking against the classification schema and existing parts to see whether an existing part can be re-used. This re-use has substantial benefits: I have seen and heard of benchmarks from a number of companies that found the cost of carrying a part through its lifecycle (concept to obsolescence) to be around $3,000 to $5,000.
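Continuing the invented schema above, a new-part request could first be screened against already-classified parts; a real PLM system would run this as a database query rather than a list scan:

```python
def find_reuse_candidates(request, catalog):
    """Return existing parts whose classification matches a new-part request.

    `request` and catalog entries are plain dicts keyed by classification
    attributes (illustrative only).
    """
    keys = ("part_class", "sub_group", "material")
    return [p for p in catalog if all(p.get(k) == request.get(k) for k in keys)]

catalog = [
    {"number": "PN-100", "part_class": "screw", "sub_group": "socket head",
     "material": "stainless steel"},
    {"number": "PN-101", "part_class": "screw", "sub_group": "pan head",
     "material": "stainless steel"},
]
request = {"part_class": "screw", "sub_group": "socket head",
           "material": "stainless steel"}
print(find_reuse_candidates(request, catalog))   # matches the PN-100 entry
```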

Implementing a classification system is more complex than implementing intelligent part numbers. If you choose to do this midstream, you will need to launch a data quality/clean-up program to ensure data integrity and adherence to the rules of classification before rolling it out.

In summary, there is no easy answer in the debate between intelligent and unintelligent part numbers. Classification systems, however, provide benefits that outweigh the effort required to clean up existing data and set up a new system.

"Disclaimer: The views and opinions expressed here are my own only and in no way represent the views, positions or opinions - expressed or implied - of my employer (present and past) "
"Please post your comments - Swati Ranganathan"

Unified Task Dashboard! Utopia?

In an earlier post, I listed a number of emerging or new TLAs (three letter acronyms) in the enterprise application space, like ERP, PLM, PDM, CRM, SCM, SRM, BPM, etc. As the usage of these applications and technologies matures within different organizations, users will soon have a set of task dashboards outlining the tasks they have been assigned within each application and when each is due.

This raises the question: if we can integrate applications and have strategies like data integration and master data integration, why can't we integrate the applications and create a unified task dashboard?

Most integrated software vendors could provide this capability, but companies that have chosen best-of-breed applications will struggle with it unless they learn to federate: build services that can kick off and complete tasks, seamlessly integrate the applications, and present users with one interface.
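One way to picture the federation approach is a thin adapter per application, each mapping its native tasks onto a shared model that the dashboard merges; everything below (class names, fields, sample tasks) is hypothetical:

```python
from dataclasses import dataclass
from datetime import date
from typing import Protocol

@dataclass
class UnifiedTask:
    app: str
    title: str
    due: date

class TaskSource(Protocol):
    """Adapter contract each application would implement (hypothetical)."""
    def fetch_tasks(self) -> list[UnifiedTask]: ...

class PlmAdapter:
    def fetch_tasks(self):
        # In reality this would call the PLM system's API.
        return [UnifiedTask("PLM", "Approve ECO-123", date(2024, 6, 1))]

class CrmAdapter:
    def fetch_tasks(self):
        return [UnifiedTask("CRM", "Follow up on lead", date(2024, 5, 28))]

def unified_dashboard(sources):
    """Merge every application's tasks into one list, soonest due first."""
    tasks = [t for s in sources for t in s.fetch_tasks()]
    return sorted(tasks, key=lambda t: t.due)

for t in unified_dashboard([PlmAdapter(), CrmAdapter()]):
    print(t.due, t.app, t.title)
```

The design choice worth noting is that the dashboard depends only on the adapter contract, so a new best-of-breed application needs only a new adapter, not a change to the dashboard.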

A unified dashboard could improve user adoption and greatly increase users' speed to proficiency, yet it is rarely considered during software selection, planning, and implementation!

"Disclaimer: The views and opinions expressed here are my own only and in no way represent the views, positions or opinions - expressed or implied - of my employer (present and past) "
"Please post your comments - Swati Ranganathan"

Data Migration: Challenges & Joy! Part 2.

Forewarned is forearmed

Before we jump into the details of migration methodologies, let us step back and understand some of the challenges ahead of us. Whether you are migrating from a legacy system or a spreadsheet/database, you have to understand everything about your “SOURCE” system.

Common misconceptions about migration

• Data migration is an IT job function.
• We know our data!
• Data migration is one of the last steps taken before you go live with the new system.
• We can always change it after we go live.
• Acquiring legacy data is easy.
• Existing data will fit the new system.
• Existing data is of good quality.
• Existing data and business processes are understood.
• Documentation exists on data rules and formatting.
• We don’t need tools or special skills.
• Migration is a separate activity.

As the lead of the migration effort, you need to work with your team to dismiss these misconceptions.

Data migration is not a matter of copying data! To migrate data successfully, one has to thoroughly understand:
(1) Why is the data being migrated, and what is its significance and value to the organization?
(2) What data is being migrated?
(3) Where does the data reside currently?
(4) What are the rules for the data in the “Source” system, and how is the target system set up?
(5) Who are the experts for each of the data domains?
(Hint: do not limit yourself to an IT resource)
(6) What is the success rate of migrating into this application?
(7) Who else in your industry segment has been through this activity?
(Hint: Do a benchmark)
(8) What do you need from a hardware/software perspective to support the data migration?
(Hint: Benchmarking and reference calls will provide this information)

Now that you are armed with some answers highlighting what you need to focus on, we can step back and think through our methodology.
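One inexpensive way to begin answering questions (2) through (4) is a profiling pass over a source extract. Here is a minimal standard-library sketch; the file name and key column are made up:

```python
import csv
from collections import Counter

def profile(path, key_column):
    """Report row count, per-column blank counts, and duplicate keys."""
    blanks = Counter()
    keys = Counter()
    rows = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rows += 1
            keys[row.get(key_column, "").strip()] += 1
            for col, val in row.items():
                if not (val or "").strip():
                    blanks[col] += 1
    dupes = {k: n for k, n in keys.items() if n > 1}
    return {"rows": rows, "blank_counts": dict(blanks), "duplicate_keys": dupes}

# Hypothetical source extract keyed by part number:
# print(profile("legacy_items.csv", key_column="PART_NUMBER"))
```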

Don't lose your sense of humor; remember your mantra: “I love data migration.”

"Disclaimer: The views and opinions expressed here are my own only and in no way represent the views, positions or opinions - expressed or implied - of my employer (present and past) "
"Please post your comments - Swati Ranganathan"

Data Migration: Challenges & Joy!


What is data migration?


Data migration is the process of transferring data between storage types, formats, or computer systems.

Over the last decade, I have led multiple data migration efforts and found each of these projects challenging and enriching. I keep swearing that I will not take up another, and yet I always do. In a series of posts, I am going to share my experiences so that you may benefit from my lessons learned and insights.

Common Data Migration Scenarios: When would you need to migrate data and create a project around this activity?
1. Mergers and acquisitions
2. Legacy system modernization
3. Enterprise application consolidation, implementation, or upgrade, such as an SAP ERP or CRM implementation
4. Master data management implementation
5. Business process outsourcing

Why Data Migration Projects Are Risky: If you have been assigned as the lead for data migration, be aware of the heavy odds against you! Do your research and do it well.
Based on reference documents I have researched over the years (Gartner, Standish Group Study), I have found that
1. 84 percent of data migration projects fail to meet expectations
2. 37 percent experience budget overruns
3. 67 percent are not delivered on time

Why Data Migration Projects Fail: In earlier posts, I have outlined the importance of data management and the pitfalls of bad data management; these contribute to the overall success or failure of a large implementation (and its data migration). Here are some reasons commonly attributed to data migration failures.

1. Lack of methodology
2. Unrealistic scope
3. Improper understanding and use of tools
4. Inattention to data quality
5. Lack of experience

While data migration is essential to the success of implementing a new application or business system, its role in the project is often overlooked and underestimated. The common assumption is that tools exist to extract and move the data into the target application, or that data migration is something a consulting partner will handle. Project teams tasked with data migration often focus solely on the timely conversion and movement of data between systems. But data migration is not just about moving the data into the new application; it is about making the data work once it is within the new application. This means the data in the new application must be accurate and trustworthy for business users to readily transition from their legacy applications and adopt the new one.
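A simple first step toward verifying that the data “works” after the load is a reconciliation check between source and target. The sketch below compares only record keys, which is an assumption of convenience; real reconciliation would also compare field values against business rules:

```python
def reconcile(source_keys, target_keys):
    """Flag records dropped or invented during migration (illustrative check)."""
    source, target = set(source_keys), set(target_keys)
    return {
        "source_count": len(source),
        "target_count": len(target),
        "missing_in_target": sorted(source - target),
        "unexpected_in_target": sorted(target - source),
    }

print(reconcile(["PN-1", "PN-2", "PN-3"], ["PN-1", "PN-3", "PN-9"]))
# {'source_count': 3, 'target_count': 3,
#  'missing_in_target': ['PN-2'], 'unexpected_in_target': ['PN-9']}
```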

In upcoming posts, I will outline the methodology I have used and why I have chosen this approach. Most of my team members would fondly remember my mantras of “Wash, Rinse & Repeat” and “I love data migration”. :)

"Disclaimer: The views and opinions expressed here are my own only and in no way represent the views, positions or opinions - expressed or implied - of my employer (present and past) "
"Please post your comments - Swati Ranganathan"

The Business Case for Data Management (MDM, Data Quality, Data Governance)!

In an earlier post, I outlined the impact of poor data management. In this post I would like to discuss data quality and the need for governance/accountability systems and technological solutions like MDM (Master Data Management).
Characteristics of good data
To analyze the data stored within applications and used by end-to-end business processes, let us review certain characteristics that tell us whether our data is of good quality and where we need to work (a short sketch after the list makes a couple of these dimensions concrete):
Completeness: Is all the requisite information available? Are data values missing, or in an unusable state?
Conformity: Are there expectations that data values conform to specified formats? If so, do all the values conform to those formats? Are these formats specified in the same way across all your applications (data silos)?
Consistency: Do you have conflicting information about the same object? Are values consistent across applications and business processes (data silos)? Do interdependent attributes always appropriately reflect their expected consistency?
Accuracy: Do data objects accurately represent the “real-world” values they are expected to model? Are there variations in spelling or reference information (IDs related to customers, suppliers, and employees)?
Duplication: Are there multiple instances of the same data objects within your data set?
Integrity: What data is missing important relationship linkages? Does the data adhere to a predefined set of rules?
Timeliness: Can the right people access the right data at the right time?
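To make a couple of these dimensions concrete, here is a small sketch checking conformity and consistency over a toy record set; the phone format and country-currency rule are assumptions, not standards:

```python
import re

PHONE = re.compile(r"^\d{3}-\d{3}-\d{4}$")   # assumed house format

records = [
    {"id": 1, "phone": "408-555-0100", "country": "US", "currency": "USD"},
    {"id": 2, "phone": "(408) 555-0101", "country": "US", "currency": "EUR"},
]

# Conformity: do values match the specified format?
nonconforming = [r["id"] for r in records if not PHONE.match(r["phone"])]

# Consistency: do interdependent attributes agree?
EXPECTED_CURRENCY = {"US": "USD"}
inconsistent = [r["id"] for r in records
                if EXPECTED_CURRENCY.get(r["country"]) not in (None, r["currency"])]

print(nonconforming, inconsistent)   # [2] [2]
```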

The fundamental nature of data is that it changes continuously, making it difficult for organizations to put the data to the best possible use and realize its benefits. Furthermore, much of the data within organizations resides in different systems (for example, ERP, CRM, order management, customer service, PDM, PLM), and it is often difficult to keep all these systems in sync.

Poor data quality isn’t always apparent in processing your day-to-day business transactions. The purchasing department, for example, may not see the difference between entering “3M” or “3M Corporation” or “3MCorp.” in a database. All of these seem to get the job done. But if you dig deeper, you will find that eliminating such duplicates could save significant time and money otherwise spent creating and reconciling duplicate records.
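Variants like the purchasing example above can be caught mechanically. Here is a minimal sketch using only Python's standard library; the normalization rules and the 0.85 threshold are assumptions to tune per data set:

```python
from difflib import SequenceMatcher

def normalize(name):
    """Crude normalization: lowercase, drop punctuation and legal suffixes."""
    name = name.lower().replace(".", "").replace(",", "").strip()
    for suffix in ("corporation", "corp", "inc"):
        name = name.removesuffix(suffix).strip()
    return name

def likely_duplicates(names, threshold=0.85):
    """Pair up entries whose normalized names are near-identical."""
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold:
                pairs.append((a, b))
    return pairs

print(likely_duplicates(["3M", "3M Corporation", "3MCorp.", "Acme Inc"]))
# [('3M', '3M Corporation'), ('3M', '3MCorp.'), ('3M Corporation', '3MCorp.')]
```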

In most organizations, resources are fully utilized, and certain “nice to have” functions are dropped as a result. One item that is commonly ignored is data quality!

The benefits of accurate data are clear. They include decreased product development and sales costs, better customer service and increased employee productivity. However, building the business case in order to launch a data quality management initiative has traditionally been a challenge!

As organizations face stiff demand and need to churn out products and services faster, there is an increased demand placed on the availability of good data to support faster and better decision making! Without a doubt, data has become the raw material of the information economy, and data governance is a strategic imperative.

In addition, requirements for timely and accurate data imposed by regulatory compliance regimes (SOX, FDA 21 CFR Part 11, RoHS, WEEE, REACH, HIPAA, OSHA) have stretched the capabilities of process and business system owners.

These demands have resulted in the development of data governance and technological solutions like MDM. Master Data Management (MDM) is the technology, tools, and processes required to create and maintain consistent and accurate lists of master data.

Some organizations hope to improve data quality by moving data from legacy systems (or consolidating data silos) to (ERP) and (CRM) packages. Other organizations use data profiling or data cleansing tools to unearth dirty data, and then cleanse it with an extract/transform/load (ETL) tool for data warehouse (DW) applications.

A word of caution: unless data quality and governance are approached in a top-down manner with alignment from all levels, we will not be able to achieve accurate, complete, and timely data! A program to address data quality is not a fix for a business system or application, nor is it merely the implementation of a technology solution like MDM, BI, or DW. Such a program exists to fix behavior and the flow of information across the enterprise, and to improve operational effectiveness, efficiency, control, and success. Merely focusing on technology will reproduce the same problems in a different application.

"Disclaimer: The views and opinions expressed here are my own only and in no way represent the views, positions or opinions - expressed or implied - of my employer (present and past) "
"Please post your comments - Swati Ranganathan"