Datasource Consulting is a well-respected consulting firm specializing in Business Intelligence and Enterprise Data Management. We pride ourselves on delivering high quality solutions for our clients. We routinely develop and publish webinars on trending topics and best practices within the industry.
At the core of every credit union are its members. Credit unions are consistently challenged to attract new members, maximize revenue opportunities across their membership, increase wallet-share, and retain existing members. With so many options in the market for deposit accounts, credit cards, loans, mortgages, and more offered by a variety of different financial institutions, how can your credit union differentiate itself from the competition?
The answer is building stronger member relationships by providing an exceptional member experience through your data via our Member 360 solution!
The return on investment in an enterprise data and analytics solution is immense, paying for itself in member profitability and improved member experience while lowering costs and creating a richer environment for informed decision-making to achieve strategic outcomes.
Our solution includes several accelerators that can be customized to leverage your technology stack, whether in the Cloud or on-premises:
• Template-based data governance and data quality operating models
• Proven Credit Union architecture framework with flexible data models and data integration mappings
• Pre-designed Credit Union self-service reporting and dashboards
• Pre-built predictive analytic models and business drivers
Few processes have a greater impact on an organization’s bottom line than those that make up the order-to-cash (OTC) cycle. A broken OTC cycle leads to wasted time and effort, cash flow deficiencies, and decreased customer satisfaction, which translates to billions of dollars down the drain every year worldwide. Optimizing OTC is the solution, but it requires solid processes and systems that talk to each other. A holistic approach to OTC – one that also includes data management, modern data architecture, and change management – will immediately enable your organization to increase revenue, improve operational excellence, and brighten the customer experience.
As banks and financial services firms re-orient to an increasingly digital and tech-savvy customer base, there is a huge focus on creating a customer-centric organization that is able to cut through organizational silos driven by product, channel, location, etc., and put the customers’ needs first.
As firms embark on this digital transformation roadmap, there is a growing realization that a strong foundation of clean data is the most critical component of this journey. When banks leverage data as a strategic asset they are able to establish core processes and create an analytics-driven culture that is able to improve customer experience, automate business processes, reduce costs, and manage risk, while ensuring regulatory compliance. Customer data is especially important for these outcomes, but the journey to a true omni-channel customer view can be challenging.
In this presentation, we’ll outline the roadblocks to watch for during an omni-channel customer view endeavor, and how a data strategy featuring MDM and Analytics solutions can help you solve them. Join us to learn:
- The effects of siloed and poor quality data, legacy applications, and other impediments on your customer view
- Best practices for defining an optimal data strategy
- Using MDM to harmonize master and reference data for a single version of the truth
- Applying Analytics to capitalize on your master data with actionable insights
Introducing The More You Snow! As a Snowflake partner, we’re excited to share with you our success stories, tips, and benefits of Snowflake’s data warehouse built for the Cloud. This is a monthly webinar series, and each session will highlight a different aspect of the software and our partnership.
Thursday, September 19th at 11am MST
In the race to keep up with customer expectations, market competition, and new compliance regulations, businesses today can’t afford to lose time and revenue searching through duplicate records and trying to understand their customer from disparate data. In this presentation, John Lieto of Wolters Kluwer and Bryson Dunn of Datasource Consulting share their success story of the MDM initiative for Wolters Kluwer’s corporate compliance division, and how this solution enabled the company to:
- Glean a deeper understanding of the customer with a ‘Golden Master’ customer record
- Improve processes by de-duplicating, preventing future duplication, and providing real-time Data Quality feedback to users
- Clean up their data by retiring or purging bad records and cleaning bad email addresses
- Obtain clear insights into their data with system enhancements, data management tools, and customer hierarchies
- Establish a Data Governance structure to define ownership and management
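The de-duplication and "Golden Master" steps described above can be illustrated with a simple match-and-merge pass. This is a minimal sketch, not the actual Wolters Kluwer implementation; the records, matching rule, and most-recent-wins survivorship policy are hypothetical:

```python
from difflib import SequenceMatcher

# Hypothetical customer records collected from disparate source systems.
records = [
    {"id": 1, "name": "Acme Corp", "email": "info@acme.com", "updated": "2019-03-01"},
    {"id": 2, "name": "ACME Corporation", "email": "info@acme.com", "updated": "2019-05-10"},
    {"id": 3, "name": "Globex Inc", "email": "sales@globex.com", "updated": "2019-01-15"},
]

def is_match(a, b, threshold=0.8):
    """Match on identical email, or on fuzzy name similarity above the threshold."""
    if a["email"].lower() == b["email"].lower():
        return True
    return SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio() >= threshold

def build_golden_records(records):
    """Cluster matching records, then keep one survivor per cluster as the 'Golden Master'."""
    clusters = []
    for rec in records:
        for cluster in clusters:
            if any(is_match(rec, member) for member in cluster):
                cluster.append(rec)
                break
        else:
            clusters.append([rec])
    # Survivorship rule for this sketch: most recent update wins.
    # Production MDM tools apply field-level survivorship rules instead.
    return [max(cluster, key=lambda r: r["updated"]) for cluster in clusters]

golden = build_golden_records(records)
```

Here the two "Acme" rows collapse into a single golden record, which is the essence of both de-duplication and duplicate prevention: new records are matched against existing clusters before they are created.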
Migrating your data to the Cloud is no longer a question of if, but when. The advantages for architecture, flexibility, and cost model mean that today, businesses widely consider Cloud the preferred approach.
In this presentation, we will explore a Cloud readiness checklist, vital questions to answer before beginning your Cloud journey, and tips for engaging critical groups within your organization. Preparation is a critical effort required for a successful Cloud project.
One of the many benefits of moving your data warehouse to the Cloud is the ability to integrate best-of-breed services to meet your business needs. With traditional architecture, changing a data warehouse is a daunting effort, often unfeasible due to complexity. In this webinar, Datasource’s Data Integration and Cloud Competency Directors will walk listeners through a customer’s recent migration from legacy technology to Snowflake’s Cloud data warehouse. Join us for an educational presentation on lessons learned, recommendations, and the roadmap to migration success.
Every organization needs a data strategy; however, the strategy is not the same for every organization. An effective data strategy will enable the company to achieve corporate goals and drive revenue opportunities.
In this educational presentation, Datasource’s Solomon Williams, Director of EDM Solutions, will review key components of a data strategy and take a closer look at:
- Defining your company’s data strategy
- Avoiding a one-size-fits-all approach
- Shifting focus from wholesale application development
- How to obtain business and technology buy-in
- Best practices for data strategy implementation
A fundamental component to any organization’s data program, a robust Data Governance operating model improves the success of numerous Enterprise Data Management initiatives. In this webinar, Nancy Couture, Datasource’s Data Governance Competency Director, will detail the components of a robust Data Governance operating model, and identify how this capability can be leveraged to enable and enhance other EDM initiatives, including:
- Metadata management
- Regulatory compliance
- Master Data Management (MDM)
- Data Quality management
- Data lifecycle management
- Overall data management
The California Consumer Privacy Act of 2018 (CCPA) is the most broad-reaching data privacy legislation enacted in the United States. CCPA establishes new rules governing how businesses handle the personal data of California consumers - companies that store large amounts of personal information will be required to disclose the types of data they collect, the purpose of the data collection, and how the data will be used and processed, as well as ensure consumers can opt out of having their data sold. Additionally, CCPA expanded organizational responsibilities pertaining to individual rights, accountability, and governance.
The significance and expected impact of the California Consumer Privacy Act is reflected in the rapidly-approaching effective date of January 1, 2020, the fact that the legislation will be enforced by the Attorney General, and the steep fines for non-compliance.
In this educational presentation, Solomon Williams, Datasource’s Director of EDM Solutions, will detail a comprehensive approach to address CCPA compliance requirements across people, processes, and technology, and how to fold those requirements into a holistic data privacy strategy.
The goal of Master Data Management (MDM) is to establish a single version of core entities which are critical to the business of an organization. Just beneath the surface of this seemingly straightforward objective are two common usage styles (analytical and operational), each of which has unique value propositions, intricacies, and challenges. In this educational presentation, Bryson Dunn, MDM Competency Director at Datasource, will explore the evolution of operational MDM and the considerations involved when maturing an analytical solution in support of operational use cases. Webinar attendees will also learn:
- Similarities and differences between Analytical and Operational MDM
- Value propositions and challenges of each style
- Design patterns, tools, and best practices
We all know that Data Quality is crucial to any company's data management program, but how do you build a data quality program based on business value? After all, every initiative is only as good as the value it provides.
In this presentation, Sally McCormack, Datasource's Data Quality Competency Director, will elaborate and provide a roadmap to help attendees:
- Understand the cost of having low-quality data
- Determine the ROI of a data quality implementation
- Communicate the cost-to-value to your stakeholders
- Leverage the added benefits of implementing a data quality program within your data governance organization
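The ROI calculation mentioned above can be illustrated with back-of-the-envelope arithmetic: compare the annual cost of defects against the cost of the program that remediates them. All figures here are invented for illustration:

```python
# Hypothetical annual figures for a data quality ROI calculation.
bad_records = 50_000          # records with quality defects
cost_per_bad_record = 12.0    # rework, returned mail, and lost-sale cost per record
program_cost = 250_000.0      # annual cost of the data quality program
defect_reduction = 0.70       # fraction of defects the program eliminates

annual_defect_cost = bad_records * cost_per_bad_record
annual_savings = annual_defect_cost * defect_reduction
roi = (annual_savings - program_cost) / program_cost

print(f"Annual defect cost: ${annual_defect_cost:,.0f}")
print(f"Annual savings:     ${annual_savings:,.0f}")
print(f"ROI: {roi:.0%}")
```

With these assumed numbers the program pays for itself and returns roughly 68% on top; the point of the exercise is that each input is something stakeholders can audit and debate.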
Data requires a story; all stories require data.
In the digital economy, data is the new oil. Discovering a new insight from analyzing data is exciting – and just the beginning of the road to improvements in organizational performance. Getting started on that road requires presenting your data with a story that is compelling and builds support for your plan (and your career).
This two-part series on data storytelling will cover how to marshal your discovery into relevant points – the What, the So What, and the Then What – and how you can deliver them as a narrative to any audience in a way that resonates and inspires people to action. Designed to help both technical and non-technical thought leaders, the audience will learn techniques that are applicable to making any persuasive presentation that is backed by data and/or solid facts:
Part 1, Data Distillation: How to use visual and conceptual techniques to sort disparate data into key messages, based on context, content, and business implications.
Part 2, Data Story Delivery: How to target an audience and deliver a compelling narrative that builds support, engagement, and action.
A crucial element of most of today’s Enterprise Data Management solutions, a Data Warehouse enables an organization to enjoy increased data quality, improved business intelligence, and timelier access to data, among other benefits. An integrated Master Data Management (MDM) solution improves upon traditional Data Warehouse solutions by correlating data across siloed applications, improving segmentation and hierarchical aggregation, and empowering the business to directly manage this data without IT involvement.
In this educational presentation, Bryson Dunn, MDM Competency Director at Datasource, will explore:
- What is MDM?
- What is a “Golden Record”, and why is it important to your business?
- MDM implementation styles and architectural considerations
- Where MDM meets Data Warehousing
- Benefits of a consolidation style hub, and a reference architecture
Many organizations take a “top-down” approach to their data governance programs and view data governance with a broad lens from the onset. While this method can work in some cases, the complexity, cost, and enterprise-wide scope often prolong the initiative, drive it over budget, or make it largely ineffective. In this educational presentation, Datasource’s Russ Starck will share compelling evidence of why the approach of “bootstrapping” your data governance program is the way to go.
This bottom-up method involves choosing projects, processes, and business domains experiencing the most pain and therefore in need of immediate data governance. The initiative can then be streamlined out to the broader organization for a ripple effect of efficiencies and automation. In this webinar, the audience will learn the benefits of:
- Growing the initiative iteratively from the bottom-up
- Focusing your initial efforts on the departmental level: working in close collaboration with the business users who are closest to the data
- Documenting the procedures you create and incorporating them into the core policies and procedures
- Managing the balance between the desire for C-level involvement and the need to continue building from the grassroots level
The Cloud computing era continues to gain momentum and is positioned as the most efficient way to host a Business Intelligence solution. In this educational webinar, we’ll explore a Cloud BI use case across two different technology stacks, dive into user experience from implementation to data discovery, and share lessons learned for a smooth transition to the Cloud.
BI in the Cloud leverages rapidly-evolving Cloud services to deliver information quickly and securely, in a more cost-efficient manner. The technology replaces depreciating hardware and systems administration expense with a lower-cost solution, allowing architects to focus on data delivery instead of backend hosting. This is especially valuable for organizations that need to separate analytic solutions and those looking to decrease their cost of BI solution ownership.
In order to entice Cloud migration, Cloud BI vendors have greatly reduced the barrier to entry by providing zero-cost deployment capabilities, rapid scaling to align with business needs, advanced analytics, seamless integrations with disparate data sources, and new capabilities delivered weekly.
Datasource experts will demonstrate several Cloud BI options and discuss the value each solution can bring to your business. Attendees will learn how upgrading to a Cloud BI architecture will restructure licensing costs, reduce total cost of ownership, and enhance the organization’s analytic horsepower to decrease topline expenses and increase profits.
Over the past few years, Microsoft has enhanced its BI tools to incorporate a comprehensive offering to serve all user types – “BI for Everyone”. Their new Tabular cube technology aligns with this strategic decision.
This content offers insight for Microsoft BI Developers and Architects as well as BI Directors and Leads, and will benefit companies who have invested in this technology or are considering adding it to their technology portfolio. In this webinar, you will learn BI best practices and see a deeper dive into the Microsoft SSAS Tabular cube modeling offering for Self-Service and Gold Standard reporting. Greg Meehan, a Business Intelligence Consultant with top-end SSAS Tabular expertise, shares key advice, tips, and experiences utilizing the tool as it has matured over the last few years with newer in-memory technology.
A foundational subset of master data and a critical link between master data domains, reference data is shared by multiple constituencies, domains, and systems across an enterprise. Without a sound approach to managing your organization’s reference data, an MDM implementation cannot be successful, yet reference data is often the most overlooked component of an MDM solution.
In this webinar, Datasource’s Solomon Williams and Orchestra Networks’ Conrad Chuang will expand on the criticality of reference data and provide educational content on:
- The importance of Reference Data Management
- Best practices in a Reference Data Domain that the entire organization can leverage
- Use cases and lessons from the trenches
- Q&A with the experts: ask your questions about approaches and implementation strategies
On May 25, 2018, the General Data Protection Regulation (GDPR) will replace and expand the Data Protection Directive to be globally applicable, regulating personal data of EU citizens as well as the exportation of personal data outside the EU. Any company with employees or customers in the EU will have to comply, regardless of location.
A strategic Data Governance program can be leveraged to address many aspects of GDPR. In this webinar, Nancy Couture, Senior Director of Delivery Enablement, will expand on GDPR requirements and discuss:
- How the key points of GDPR strategically align with the goals of Data Governance
- How organizations can leverage Data Governance to identify their data footprint and hold people accountable for the quality, transparency, and knowledge around their data
- Using a solid Data Governance capability for a smoother compliance roadmap and more!
Organizations today face the question of how to optimize their data warehouses, save money, increase business value, and grow end-user utilization.
Cloud technologies have evolved to the point where these environments are affordable, secure, and scalable to meet business needs. The question is: "What is the best mix?" Is it on-premises, hybrid, or cloud? And how should a cloud migration be approached?
In this webinar, we will cover:
- Common concerns
- Roadblocks
- Considerations for cloud providers
- A methodology for migrating your organization's data warehouse to the cloud
Hertz is one of the largest car rental companies in the world, with three global brands and 11,000 locations in 150 countries. Hertz has a mission to be the lowest cost, highest quality, and most customer-focused rental company in every market. However, high growth over the years, the current state of internal programs, processes, and new complexities from acquisitions have presented challenges. To address these challenges, enable growth and efficiencies, and meet the objectives of their mission, Hertz has initiated a complete digital transformation.
Attendees will learn:
- The guiding principles of the program
- An overview of the reference data architecture
- The multi-domain approach
- The criticality of Data Governance in an MDM solution
- Lessons learned to date
In this session, the presenters will step through the main business drivers associated with the Data Quality and Data Governance initiatives, the tools needed and leveraged and, most importantly, how to attract and engage key business owners of the data to support a long-term Data Governance organization.
Solomon Williams and Conrad Chuang discuss the changing demands on master data management programs, their ramifications, the change in perspective, and the impact on today’s enterprises. Master data management programs used to be domain-specific - you were working on either customer MDM or product MDM. But today, MDM has shed its domain-centricity and is now process-oriented. Modern multi-domain MDM programs support key business processes, such as Idea to Opportunity, Opportunity to Market, Market to Order, and Order to Cash.
Metadata Management is one of the most overlooked data management disciplines. Its complexity and often-underestimated benefits are typically the culprits. However, Metadata Management is a foundational discipline that can further mature other data management initiatives, especially Data Quality Management, Data Governance, and Data Stewardship. This presentation details the many aspects of Metadata Management as well as its positive impact when applied in coordination with other data management disciplines.
Learn how Data Governance can deliver significant value quickly, build better relationships with BI teams, and defend your company against the coming Analytical Zombie Apocalypse. Based on real-life experience in organizations like yours, this session shows you how to enliven your reporting, analytics, and metrics efforts through your Data Governance capabilities.
In this presentation, we'll address the plague of Executive Attention Span Disorder (EASD) and proven strategies to make sure your long-running enterprise data program (BI, Data Governance, Metadata, Data Warehouse, etc.) won't be abandoned after the first three months.
You know you need Data Governance, but how do you convince the people in your organization to invest in a long-term program?
Many prefer to focus on things that can be completed in 90 days or less and deliver value immediately. The best-designed DG program can fail or never launch in the first place when not sold properly. Get some great ideas from our hard-won experience selling, designing, and running successful enterprise data governance programs.
A common goal of MDM implementations is to create an accurate, uniform, 360-degree view of the customer. Accurate, consistent views provide an enhanced customer experience for multi-channel interactions as well as cross-sell and up-sell opportunities. The next logical step for advancing the usefulness of the master data record is to include the social media touchpoints of the customer. Enriching the existing MDM customer record with social media data opens a direct line of marketing to the customer. However, this is not without its challenges.
This presentation defines the “value proposition” of social media data enrichment.
In this webinar, we define reference data and Reference Data Management and identify the sources of reference data. We discuss how a lack of managed reference data impacts an organization and several key aspects of a reference data management capability.
In this webinar, Datasource Master Data Management Practice Lead Solomon Williams provides a high-level introduction to both master data and MDM, while discussing the many sources of master data. He also looks at how master data impacts an organization and provides several use cases for MDM.
"Is Data Quality in Your DNA?" is the case study of Illumina, a leading manufacturer of tools for genetic research. The case study follows Illumina as they migrate over 100 legacy ERP systems into a single ERP environment.
InfaTools Stage Mapping Accelerator is a tool Datasource leverages to enable faster project delivery. It is a streamlined application that provides seamless interaction with Informatica PowerCenter to generate stage mappings, sessions, and workflows, eliminating the need for additional applications.
In today’s age of agile development, increasing pressure to deliver, and the importance of developing high-quality models, how can we afford not to look at proven models that can save time, reduce cost, and increase quality? Universal Data Models and Datasource Consulting have teamed up in a partnership to provide even more outstanding solutions.
This webinar will discuss tips and tricks for implementation and provide insight into why data quality is a crucial component of any Enterprise Information Management (EIM) solution. The webinar is quick, concise, and to the point, so you will quickly know whether Data Quality should be a focus for your enterprise. The importance of Data Quality is rising in today’s enterprise business world. According to TDWI, bad data costs the United States more than $600 billion annually, and low data quality can cause a variety of challenges for your business.
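The kinds of rules a data quality program enforces can be sketched as simple validations. This is a minimal, hypothetical example of completeness and validity checks; the row layout, email pattern, and reference set are invented for illustration:

```python
import re

# Hypothetical customer rows to profile for completeness and validity.
rows = [
    {"customer_id": "C001", "email": "jane@example.com", "state": "CO"},
    {"customer_id": "C002", "email": "not-an-email", "state": "CO"},
    {"customer_id": "",     "email": "bob@example.com", "state": "ZZ"},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
VALID_STATES = {"CO", "CA", "NY", "TX"}  # abbreviated reference set for the sketch

def profile(rows):
    """Count rule violations per data quality dimension."""
    issues = {"missing_id": 0, "bad_email": 0, "bad_state": 0}
    for row in rows:
        if not row["customer_id"]:          # completeness check
            issues["missing_id"] += 1
        if not EMAIL_RE.match(row["email"]):  # validity check (format)
            issues["bad_email"] += 1
        if row["state"] not in VALID_STATES:  # validity check (reference data)
            issues["bad_state"] += 1
    return issues

report = profile(rows)
```

Commercial data quality tools generalize exactly this pattern: declarative rules per column, violation counts per dimension, and trend reporting over time.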
Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service that makes it simple and cost-effective to analyze all your data using your existing business intelligence tools. Cloud-based data warehouse solutions have been gaining a lot of attention, and Amazon Redshift is a leader in this space with its cloud-based, MPP, columnar analytic database offering.
Business Intelligence/Data Warehouse (BI/DW) solutions often require a multi-million dollar investment for even modest-size companies. Senior management certainly concedes that timely, reliable information is essential to business operations, but they are challenged to demonstrate real cost savings or revenue creation directly attributable to the Business Intelligence and Data Warehouse investment. In these challenging economic times, BI/DW programs are coming under increasing scrutiny to justify their budgets relative to competing investments.
The presentation will use two Business Intelligence & Data Warehouse case studies to show a measurable financial return that far exceeds the initial investment. These two examples will be used to illustrate a general protocol for scoping and managing BI/DW projects to ensure that the value proposition is quantitative and demonstrable.
There is much interest in leveraging agile methods for delivering DW/BI applications cheaper, faster, and better than traditional approaches. The agility arises from narrowly focused project scope, iterative sprint development efforts, and active participation of business users to prioritize scope and functionality of successive iterations until the desired functionality is achieved.
However, many agile projects minimize the data modeling discipline and architecture principles in order to preserve their short and focused sprints. This often results in a portfolio of disparate point solutions with considerable data redundancy and data inconsistencies across BI applications. Although each application may serve its purpose and provide business value, the portfolio becomes unmanageable from a maintenance and integration perspective.
Agile development and enterprise integration need not be mutually exclusive. This webinar will outline an architecture and methodology to achieve both by leveraging data modeling techniques, tools, and metadata management.
Data Masking is a growing concern for major corporations. Data security, regulatory, and audit compliance require organizations to protect sensitive data such as personally identifiable information (PII). Having been involved in numerous projects using data masking, Datasource Consulting understands the complexity, challenges, and nuances of successfully completing such projects.
Join Business Intelligence Consultant Yelena McElwain for an informative session that will discuss many of the practical approaches and potential pitfalls associated with Data Masking. She will take you on an in-depth tour of the rules and techniques commonly associated with data masking projects.
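One widely used masking rule is consistent (deterministic) substitution: the same input always masks to the same output, so joins and referential integrity survive across masked tables. Below is a minimal sketch of the idea, assuming hash-based pseudonymization with a hypothetical salt; it illustrates the technique only and is not a complete production control on its own:

```python
import hashlib

def mask_ssn(ssn: str, salt: str = "demo-salt") -> str:
    """Deterministically mask an SSN: the same input always yields the same
    masked value, so records masked in different tables still match on join."""
    digest = hashlib.sha256((salt + ssn).encode()).hexdigest()
    digits = "".join(c for c in digest if c.isdigit())[:9].ljust(9, "0")
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"  # preserve SSN format

def mask_email(email: str, salt: str = "demo-salt") -> str:
    """Mask the local part of an email while preserving the domain,
    which keeps masked test data realistic."""
    local, _, domain = email.partition("@")
    token = hashlib.sha256((salt + local).encode()).hexdigest()[:8]
    return f"user_{token}@{domain}"
```

Keeping the salt secret matters: without it, a determined attacker could hash candidate values and reverse the mapping, which is one of the pitfalls sessions like this warn about.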
Much attention has been paid, appropriately, to building the data warehouse. Architecture, ETL, BI tools, appliances, and agile methodology are certainly relevant to building a data environment optimized for analysis and reporting. However, this discussion will focus on using the data warehouse.
Predictive Analytics leverages advanced statistical modeling to produce powerful forecasting tools. With such tools, one can produce reliable sales forecasts, define staffing needs, set inventory levels, and refine manufacturing schedules. Predictive models project the optimal replacement frequency of aircraft parts, avoiding expensive or dangerous in-service failures. Actuaries use statistical models to set competitive insurance premiums that cover health care costs with a reasonable profit margin.
The presentation will start with a refresher on simple linear regression and correlation and expand into more advanced analytics such as multivariate regression and principal component analysis. It will offer suggestions for initiating advanced analytical techniques in your organization and avoiding many of the common pitfalls.
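The simple linear regression refresher fits in a few lines: a least-squares fit of y = a + b·x. The monthly spend and sales figures here are invented purely to illustrate the calculation:

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a + b*x (simple linear regression)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope b = covariance(x, y) / variance(x); intercept a = mean_y - b * mean_x.
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical monthly ad spend ($k) vs. sales ($k).
spend = [10, 20, 30, 40, 50]
sales = [25, 44, 68, 85, 105]

a, b = fit_line(spend, sales)
forecast = a + b * 60  # predicted sales at $60k spend
```

Multivariate regression generalizes the same idea to several predictors at once, which is where the presentation heads after this warm-up.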
There has been a common and consistent complaint across nearly every organization – “It takes too long to get the information I need to make decisions!” Businesses are frustrated with the complexity of their data. This turns us back to why we began investing in a BI solution in the first place: eliminating multiple versions of the truth. Have we even made any progress? Does this sound familiar?
This webinar will go over innovative changes that can be made to help put the business back into Business Intelligence. You may have heard a lot about Agile development methodologies and how they can help increase the velocity of your BI program. This session will first focus on the tools, methods, and best practices that can help your BI program become more agile. Then we’ll discuss how Data Virtualization can be used to dramatically shorten the lifecycle of information delivery while maintaining data governance and helping your BI program become more agile.