Datasource Consulting is a well-respected consulting firm specializing in Business Intelligence and Enterprise Data Management. We pride ourselves on delivering high quality solutions for our clients. We routinely develop and publish webinars on trending topics and best practices within the industry.
Solomon Williams and Conrad Chuang discuss the changing demands on master data management programs, their ramifications, the shift in perspective, and the impact on today’s enterprises. Master data management programs used to be domain-specific: you were working on either a customer MDM or a product MDM. But today, MDM has shed its domain-centricity and become process-oriented. Modern multi-domain MDM programs support key business processes, such as Idea to Opportunity, Opportunity to Market, Market to Order, and Order to Cash.
Metadata Management is one of the most overlooked data management disciplines. Its complexity and often-underestimated benefits are typically the culprits. However, Metadata Management is a foundational discipline that can further mature other data management initiatives, especially Data Quality Management, Data Governance, and Data Stewardship. This presentation details the many aspects of Metadata Management as well as its positive impact when applied in coordination with other data management disciplines.
Learn how Data Governance can deliver significant value quickly, build better relationships with BI teams, and defend your company against the coming Analytical Zombie Apocalypse. Based on real-life experience in organizations like yours, this session shows you how to enliven your reporting, analytics, and metrics efforts through your Data Governance capabilities.
In this presentation, we'll address the plague of Executive Attention Span Disorder (EASD) and share proven strategies to make sure your long-running enterprise data program (BI, Data Governance, Metadata, Data Warehouse, etc.) won't be abandoned after the first three months.
You know you need Data Governance, but how do you convince the people in your organization to invest in a long-term program?
Many stakeholders prefer to focus on initiatives that can be completed in 90 days or less and deliver value immediately. Even the best-designed DG program can fail, or never launch in the first place, when it is not sold properly. Get some great ideas from our hard-won experience selling, designing, and running successful enterprise data governance programs.
A common goal of MDM implementations is to create an accurate, uniform, 360-degree view of the customer. Accurate, consistent views enable an enhanced customer experience across multi-channel interactions as well as cross-sell and up-sell opportunities. The next logical step for advancing the usefulness of the master data record is to include the customer's social media touchpoints. By leveraging social media data to enrich the existing MDM customer record, a direct line of marketing is opened to the customer. However, this is not without its challenges.
This presentation defines the “value proposition” of social media data enrichment by discussing:
In this webinar, we define reference data and Reference Data Management and identify the sources of reference data. We discuss how a lack of managed reference data impacts an organization and several key aspects of a reference data management capability.
In this webinar, Datasource Master Data Management Practice Lead Solomon Williams provides a high-level introduction to both master data and MDM, while discussing the many sources of master data. He also looks at how master data impacts an organization and provides several use cases for MDM.
"Is Data Quality in Your DNA?" is the case study of Illumina, a leading manufacturer of tools for genetic research. The case study follows Illumina as they migrate over 100 legacy ERP systems into a single ERP environment.
InfaTools Stage Mapping Accelerator is a tool Datasource leverages to enable faster project delivery. It is a streamlined application that provides seamless interaction with Informatica PowerCenter to generate stage mappings, sessions and workflows, eliminating the need for additional applications.
Developing processes and methods to define the handling of data within an organization can be challenging. This can be even more difficult when faced with external regulations or compliance considerations (HIPAA, PCI, SOX). These best practices are critical to ensure data quality, reduce risk, and allow organizations to leverage the full value of their enterprise data. At Datasource Consulting, we have worked through these challenges first hand and have helped our clients stand up efficient and well-orchestrated Data Governance Programs. Join us for an informative webinar, Data Governance: Tips & Advice for Building and Strengthening a Data Governance Program, and let us share our insights. This webinar covers:
In today’s age of agile development, increasing pressure to deliver, and the importance of developing high-quality models, how can we afford not to look at proven models that can save time, reduce cost, and increase quality? Universal Data Models and Datasource Consulting have now teamed up in a partnership to provide even more outstanding solutions.
This webinar discusses tips and tricks for implementation and provides insight into why data quality is a crucial component of any Enterprise Information Management (EIM) solution. The webinar is quick, concise, and to the point, so you will quickly know whether Data Quality should be a focus for your enterprise. The importance of Data Quality is rising in today’s enterprise business world. According to TDWI, bad data costs more than $600 billion annually in the United States. Low data quality can cause a variety of challenges for your business, including:
Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service that makes it simple and cost-effective to analyze all your data using your existing business intelligence tools. Cloud-based data warehouse solutions have been gaining a lot of attention, and Amazon Redshift is a leader in this space with its cloud-based, MPP (massively parallel processing) columnar analytic database offering.
Business Intelligence/Data Warehouse (BI/DW) solutions often require a multi-million dollar investment for even modest-sized companies. Senior management certainly concedes that timely, reliable information is essential to business operations, but they are challenged to demonstrate real cost savings or revenue creation directly attributable to the Business Intelligence and Data Warehouse investment. In these challenging economic times, BI/DW programs are coming under increasing scrutiny to justify their budgets relative to competing investments. The presentation uses two Business Intelligence and Data Warehouse case studies to show a measurable financial return that far exceeds the initial investment. These two examples illustrate a general protocol for scoping and managing BI/DW projects to ensure that the value proposition is quantitative and demonstrable.
There is much interest in leveraging agile methods for delivering DW/BI applications cheaper, faster, and better than traditional approaches. The agility arises from narrowly focused project scope, iterative sprint development efforts, and active participation of business users to prioritize scope and functionality of successive iterations until the desired functionality is achieved. However, many agile projects minimize the data modeling discipline and architecture principles in order to preserve their short and focused sprints. This often results in a portfolio of disparate point solutions with considerable data redundancy and data inconsistencies across BI applications. Although each application may serve its purpose and provide business value, the portfolio becomes unmanageable from a maintenance and integration perspective. Agile development and enterprise integration need not be mutually exclusive. This webinar outlines an architecture and methodology to achieve both by leveraging data modeling techniques, tools, and metadata management.
Data Masking is a growing concern for major corporations. Data security, regulatory, and audit compliance require organizations to protect sensitive data such as personally identifiable information (PII). Having been involved in numerous projects using data masking, Datasource Consulting understands the complexity, challenges, and nuances of successfully completing such projects. Join Business Intelligence Consultant Yelena McElwain for an informative session that will discuss many of the practical approaches and potential pitfalls associated with Data Masking. She will take you on an in-depth tour of the rules and techniques commonly associated with data masking projects.
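To make those techniques concrete, here is a minimal, hypothetical Python sketch of three masking approaches commonly used on PII: partial redaction, deterministic substitution, and shuffling. The field names, formats, and helper functions are illustrative assumptions, not part of any specific toolset or engagement.

```python
import hashlib
import random

def mask_ssn(ssn: str) -> str:
    """Partial redaction: keep the last four digits, mask the rest."""
    return "***-**-" + ssn[-4:]

def mask_email(email: str) -> str:
    """Deterministic substitution: replace the local part with a stable
    token so the same input always masks to the same output (useful for
    preserving joins across masked tables)."""
    local, _, domain = email.partition("@")
    token = hashlib.sha256(local.encode()).hexdigest()[:8]
    return f"user_{token}@{domain}"

def shuffle_column(values: list[str], seed: int = 42) -> list[str]:
    """Shuffling: keep real values in the column, but reorder them so
    they no longer line up with the rest of the row."""
    shuffled = values[:]
    random.Random(seed).shuffle(shuffled)
    return shuffled

print(mask_ssn("123-45-6789"))  # ***-**-6789
print(mask_email("jane.doe@example.com"))
print(shuffle_column(["Smith", "Jones", "Lee", "Patel"]))
```

Deterministic substitution is worth noting in particular: because the masked value is derived from the original, referential integrity between masked systems is preserved without storing a lookup table.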
Much attention has been paid, appropriately, to building the data warehouse. Architecture, ETL, BI tools, appliances, and agile methodology are certainly relevant to building a data environment optimized for analysis and reporting. However, this discussion will focus on using the data warehouse. Predictive Analytics leverages advanced statistical modeling to produce powerful forecasting tools. With such tools, one can produce reliable sales forecasts, define staffing needs, set inventory levels, and refine manufacturing schedules. Predictive models project the optimal replacement frequency of aircraft parts, avoiding expensive or dangerous in-service failures. Actuaries use statistical models to set competitive insurance premiums that cover health care costs with a reasonable profit margin. The presentation will start with a refresher on simple linear regression and correlation, then expand into more advanced analytics such as multivariate regression and principal component analysis. It will offer suggestions for initiating advanced analytical techniques in your organization and avoiding many of the common pitfalls.
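As a refresher on the starting point above, simple linear regression and Pearson correlation can be computed in a few lines of Python. The ad-spend and units-sold figures below are purely hypothetical illustration data, not drawn from the case material.

```python
def linear_regression(xs, ys):
    """Ordinary least squares fit of y = a + b*x for paired samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    b = sxy / sxx              # slope
    a = mean_y - b * mean_x    # intercept
    return a, b

def correlation(xs, ys):
    """Pearson correlation coefficient r between paired samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    syy = sum((y - mean_y) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical data: monthly ad spend ($k) vs. units sold
spend = [10, 20, 30, 40, 50]
units = [25, 44, 68, 85, 112]
a, b = linear_regression(spend, units)
print(f"forecast at spend=60: {a + b * 60:.1f} units")  # 131.3
print(f"r = {correlation(spend, units):.3f}")           # 0.998
```

The fitted line (a = 2.3, b = 2.15) becomes the forecasting tool: plug in a planned spend level and read off the expected units. The correlation of 0.998 indicates the linear model explains nearly all the variation in this toy dataset.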
There has been a common and consistent complaint across nearly every organization: “It takes too long to get the information I need to make decisions!” Businesses are frustrated with the complexity of their data. This brings us back to why we began investing in a BI solution in the first place: eliminating multiple versions of the truth. Have we even made any progress? Does this sound familiar? This webinar goes over innovative changes that can be made to help put the business back into Business Intelligence. You may have heard a lot about Agile development methodologies and how they can help increase the velocity of your BI program. This session will first focus on the tools, methods, and best practices that can help your BI program become more agile. Then we’ll discuss how Data Virtualization can be used to dramatically shorten the lifecycle of information delivery while maintaining data governance and helping your BI program become more agile.