London 18 September 2014 – techUK
One of the key challenges faced by data professionals when implementing a data management or data modelling initiative is convincing senior stakeholders of its value and demonstrating the potential business benefits of the project. Embarcadero invites you to attend a breakfast briefing to hear how its customers are using data modelling as a core element of key business initiatives and as a result are getting value from their enterprise data.
Speakers will include a Head of Enterprise Data Warehousing and Product Managers from Embarcadero.
Realize measurable cost savings with tools built for data modeling
Using Visio or Excel for data modeling is like trying to drive a nail with a rock. Sure, the rock is a simple, readily available tool, but if you’re trying to build a house the process can lead to cost overruns, project delays, and a fair amount of personal pain.
This expert whitepaper will show you the true, cascading costs of using tools such as Visio or Excel for data modeling. You’ll learn how using tools built specifically for data modeling can help you:
- Keep your modeling capability in lockstep with your databases and data warehouses
- Keep MDM and data governance projects on schedule
- Reduce the high cost of data integration
- Achieve up to a 387% ROI in year one
There’s an easy way to justify the cost of moving to better, easier data modeling tools. Find out how: get the whitepaper.
Is data modeling outdated? This excerpt from the book Data Modeling for MongoDB: Building Well-Designed and Supportable MongoDB Databases by Steve Hoberman argues that data modeling concepts are still vital to business success and introduces useful terminology and tips for simplifying a complex information landscape with MongoDB applications. Hoberman is the most requested data modeling instructor in the world and has educated more than 10,000 people across five continents about data modeling and BI techniques. In this excerpt, he emphasizes the necessity for businesses to implement data modeling concepts and explores a variety of business uses for data models.
By Steve Hoberman
This excerpt is from the book Data Modeling for MongoDB: Building Well-Designed and Supportable MongoDB Databases, by Steve Hoberman. Published by Technics Publications, LLC, Basking Ridge NJ, ISBN 9781935504702. Copyright 2014, Technics Publications.
Here’s a great article on how data modelling still applies to the Big Data world, especially with regard to NoSQL.
Below is an extract from the article, the link for the full article can be found below:
Data Modeling Still A Priority
Data modeling, then, still has an important role to play in NoSQL environments. “The data modeling process is always there,” van der Lans says. You can look at that role in a simple way, he explains, by thinking of it as a process that leads to a diagram. In the process of creating the diagram, you are trying to understand what the data means and how the data elements relate together. Thus, “understanding” is a key aspect of data modeling.
Just as is the case when you are doing data modeling for SQL environments, data modeling for NoSQL requires doing the same homework: Talk to end users and read reports to come up with some logical model that specifies the structure and meaning of the data. That’s the business-oriented step, he notes. “The moment we want to interpret data we have to understand it,” he says. Implementing that logical model – the physical and technical aspect of data modeling – is what changes dramatically in NoSQL environments compared to SQL environments.
In the SQL environment, the data modeling process that leads to such an understanding lives inside the database server. In NoSQL environments, however, the data modeling ends up in the code of the application that reads the data, van der Lans says. “Twenty years ago, if you would do data modeling, the result would always be a database structure – tables and columns.” In today’s NoSQL environments, “what will happen is the data model ends up as lines of application code….The structure is there but in the lines of application code.” Because of that approach, changing how you want to look at the data is just a matter of changing the code; there is no requirement to reorganize the physical database.
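A minimal Python sketch of the shift van der Lans describes (all names here are hypothetical, not from the article): in SQL the structure lives in server-enforced DDL, while in a MongoDB-style document store the same structure is only enforced by the application code that reads each document.

```python
# In a SQL database, structure lives in DDL the server enforces, e.g.:
#   CREATE TABLE customers (id INT PRIMARY KEY, name VARCHAR(100), city VARCHAR(50))
# In a schemaless document store, that structure lives in code like this.

def read_customer(doc):
    """Interpret a schemaless document; the 'data model' is this function."""
    return {
        "id": doc["_id"],
        "name": doc.get("name", "<unknown>"),
        # The nested address is optional; the application decides the shape.
        "city": doc.get("address", {}).get("city"),
    }

doc = {"_id": 1, "name": "Acme Ltd", "address": {"city": "London"}}
print(read_customer(doc))  # → {'id': 1, 'name': 'Acme Ltd', 'city': 'London'}
```

Changing how you look at the data here means editing `read_customer`, not running an `ALTER TABLE` against a physical schema, which is exactly the trade-off the article highlights.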
We have all spent a lot of time tuning indexes – it can be frustrating and time consuming. Adding indexes often seems like the right solution at the time – until they fail to deliver any real improvement.
Join Martin Hubel, Database Consultant & DBA Evangelist, and Embarcadero’s Scott Walz as they discuss best-practice strategies and techniques for avoiding index tuning pitfalls, and learn proven techniques to tune and optimize indexes.
Register for the webinar to learn:
- Which indexes to drop
- Three main “truths” of application tuning
- How to keep index tuning from going awry
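As a toy illustration of the trade-off behind topics like “which indexes to drop” (a sketch in plain Python, not Embarcadero’s tooling): an index is essentially a lookup structure that speeds up reads but must be maintained on every write.

```python
# Toy model of a database index: a hash map keyed on the indexed column.
rows = [{"id": i, "status": "open" if i % 10 == 0 else "closed"}
        for i in range(1000)]

# Without an index, every query scans all rows.
open_rows_scan = [r for r in rows if r["status"] == "open"]

# With an "index", the query is a direct lookup -- no scan.
index = {}
for r in rows:
    index.setdefault(r["status"], []).append(r)
open_rows_indexed = index["open"]

assert open_rows_scan == open_rows_indexed  # same result, less read work
# But every insert or update must now maintain `index` as well, which is
# why dropping indexes that no query uses can itself be a tuning win.
```

The same reasoning underlies real index tuning: each index trades write and storage overhead for faster reads, so an index no query benefits from is pure cost.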
Tuesday July 29, 2014
7:00 AM PDT / 9:00 AM CDT / 10:00 AM EDT
11:00 AM PDT / 1:00 PM CDT / 2:00 PM EDT
About the presenters:
Martin Hubel – Database Consultant & DBA Evangelist, MHC Inc.
Martin Hubel has extensive experience in database design, application architecture, and system administration for relational database management systems. For the past 20 years, Martin has consulted for and taught more than 400 clients worldwide, helping them use database management systems more effectively.
Scott Walz has more than 20 years of experience in the area of database development and currently serves as the Director of Software Consultants for Embarcadero Technologies. Prior to joining Embarcadero 12 years ago, Scott served as a development lead for Louisville Gas & Electric. He holds a bachelor’s degree in computer information systems from Western Kentucky University.
DB Optimizer is a database optimization tool that maximizes database and application performance, reliability, and availability by profiling, tuning, and load testing SQL code.
Part 1 of 5 | 3 min 46 sec
Get started with a brief introduction to DB Optimizer’s features and capabilities, an overview of the GUI, and a walkthrough of adding data sources.
Part 2 of 5 | 1 min 29 sec
Take a look at the SQL editing capabilities in DB Optimizer’s IDE. Organize result sets, scroll through values, and rewrite queries.
Part 3 of 5 | 10 min 24 sec
Learn how to use DB Optimizer to tune SQL. Generate cases and perform detailed analysis with visual cues to quickly optimize SQL code. Also, watch how Visual SQL Tuning gives DBAs an advantage when it comes to tuning!
Part 4 of 5 | 5 min 16 sec
See how DB Optimizer profiles databases to quickly identify bottlenecks and other problem areas. This video shows how DBAs can capture profiling details, address wait times, and report via XML files.
Part 5 of 5 | 3 min 54 sec
Watch a demonstration of conducting load testing for performance optimization. Load testing helps ensure your SQL code can handle the rigors of the most hectic database environments.
During the last few years, several database market dynamics have led many people to question the utility of data modeling. In particular, the advent of XML information management, growing frustration with traditional relational database management system (RDBMS) capabilities and vendor relationships, and the expanding influence of cloud platforms have led many organizations to reconsider their commitments to the practice of data modeling. Read this paper for perspectives on why data modeling is more important than ever before, and that organizations that seek to fully leverage database market dynamics must redouble their focus on data modeling.
This document will explain:
- Why the roles for NoSQL and Big Data are broadly misunderstood today
- How market enthusiasm for these loosely defined domains has challenged traditional data modeling assumptions
- Why data modeling is more important than ever
- How organizations that seek to fully leverage database market dynamics must redouble their focus on data modeling
A White Paper By Joe Maguire and Peter O’Kelly