
What’s the Best Data Modeling Tool?

Apr 3, 2013   //   by Karen Lopez   //   Blog, Data, Data Modeling  //  4 Comments

PostIt Data Model

My general advice on this question (and I get it a lot) focuses more on fit to your organization and technical environment than on specific features. I should disclose that we partner with most of the major data modeling tool vendors, but we don’t make any money from the sales of such tools.  We aren’t a reseller and don’t get any referral fees.  The partner programs I’m part of with these vendors are closer to an MVP or Ace program.

Target DBMS and Development Tool Support

It depends on what your target DBMSs and data stores are, of course.  While I am a huge fan of modeling for the knowledge, to document the business, and to communicate requirements, ultimately, to get the full value of modeling, we need to ensure that our modeling products can directly support development and implementation activities as well as all the requirements gathering.

I see weaker returns on the value of modeling when the output of the chosen modeling tool cannot be easily consumed by developers in their tools, or when the tool cannot forward and reverse engineer databases.  This also includes XML files, non-relational DBs and even the most common database in the world, Excel.

I consider this one factor the most important of all in the post. 

Modeling, Not Just Drawing

Every obstacle between the data models and another person’s ability to consume them is an obstacle to integrating the models into the full development and support lifecycle.  There’s very little chance of Model Driven Development succeeding if all that wonderful data model goodness has to be retyped (and reinterpreted) into someone else’s process or products.

A data modeling tool needs to support the highly-iterative nature of data modeling.  Drawing and database diagramming tools rarely have the ability to compare and resolve differences between models and databases.  I do these types of compares many times a day, across many models and databases.
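At its core, a model-to-database compare is a diff of two sets of object definitions. A minimal sketch of the idea in Python (the table/column dictionaries are purely illustrative, not any vendor's actual storage format):

```python
# Minimal sketch of a model-vs-database compare. Each side is a dict
# mapping table name -> {column name -> declared type}.
# These structures are hypothetical, not any tool's real format.

def compare_schemas(model, database):
    """Report objects present on only one side, and columns whose
    declared types disagree between the model and the database."""
    diffs = []
    for table in sorted(set(model) | set(database)):
        if table not in database:
            diffs.append(f"table {table}: in model only")
            continue
        if table not in model:
            diffs.append(f"table {table}: in database only")
            continue
        m_cols, d_cols = model[table], database[table]
        for col in sorted(set(m_cols) | set(d_cols)):
            if col not in d_cols:
                diffs.append(f"{table}.{col}: in model only")
            elif col not in m_cols:
                diffs.append(f"{table}.{col}: in database only")
            elif m_cols[col] != d_cols[col]:
                diffs.append(f"{table}.{col}: type {m_cols[col]} vs {d_cols[col]}")
    return diffs

model = {"Customer": {"id": "int", "name": "varchar(100)"}}
database = {"Customer": {"id": "int", "name": "varchar(50)", "created": "datetime"}}
for d in compare_schemas(model, database):
    print(d)
```

A real modeling tool does far more (it resolves the differences in either direction, generates ALTER scripts, and tracks types, constraints and metadata a drawing tool never captures), but the compare-and-resolve cycle above is the part a diagramming tool simply cannot do.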

Data modeling tools also allow data architects to design and record important information beyond what might appear in an ERD or DDL.  These features include security, encryption, capacity planning, data privacy and a slew of other critical data requirements.  In shops where a diagramming tool such as Visio is used for data modeling, these important metadata items are typically buried in a spreadsheet or document.  They can’t be easily searched or reported on, nor easily shared across projects.

Vendor Tech Support

When you buy a data modeling tool you aren’t just buying a product – you are investing in a vendor.  Log a few tickets during your evaluation.  Learn how well support works.  Remember, the vendor will never treat you better than when you are considering a purchase. If support is slow or non-existent, it’s not going to get better after you buy.  Don’t use your salesperson as the support person.  They may be great, but they might not be your salesperson tomorrow. You are evaluating the support process and system, not one person.

If the vendor says that tech support isn’t available to evaluators, ask them why not and tell them you need to evaluate them as well as their product.

Future Support

The technical architecture landscape is changing faster than most vendors can respond.  The good news is that often means it’s changing faster than your organization can change as well.  However, you need to review the product’s release history to assess the vendor’s responsiveness to changes in the industry.  Sure, past performance is not a guaranteed indicator of future behaviour, but it is something to consider.

Ask to see public roadmaps or presentations about products/features in development.  Vendors can’t always share a lot of details but generally they have information they can share either at a high level or features that are currently in beta testing.

Licensing Options

I don’t know about you, but my work happens in many locations and across many devices. As much as I’d like to buy a separate license for every desktop, laptop and tablet I own, I’m not going to do that. I do most of my modeling on a desktop with three screens because it makes me more productive. However, when I am in a meeting I want to use my laptop or tablet. If the only licensing scheme available is via device or MAC address I either have to buy multiple licenses or remote into my desktop to do the work. Neither one reflects how modelers really work. Licensing options such as floating/concurrent or user-based do.


Performance

If common tasks take hours instead of seconds, the cost of using the tool may exceed the value it can deliver, no matter how pretty your models are.  If publishing models takes more than a few minutes, you won’t publish (share) as often. You’ll want to evaluate the full lifecycle of modeling, including forward and reverse engineering, printing, publishing, sharing, exporting, importing…everything a professional data modeler needs to do to get the job done.

Version Control

A team of data modelers (even a team of one, plus co-workers like developers and DBAs) must have real version control for models.  That means version control not at the file level but at the object level within the model.  A tiny change in one part of the model can ripple throughout the model, completely unseen on the current diagram.  The ability to compare the impact of a change against previous versions of the model is critical. Because data models are typically persisted in a proprietary format, third-party version control tools like Subversion can only detect changes at the full file level, not at the object level.
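Object-level version control means diffing the model's objects, not its serialized file. A rough sketch of the difference, using simplified dictionaries of model objects keyed by name (illustrative structures only; real tools persist far richer metadata):

```python
# Sketch of object-level model versioning. Each model version is a
# dict of object name -> properties. Diffing at this level pinpoints
# exactly which objects changed, even ones not shown on the current
# diagram. (Hypothetical structures, not any tool's real format.)

def diff_versions(old, new):
    """Return the objects added, removed, and changed between two
    versions of a model."""
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    changed = sorted(
        name for name in set(old) & set(new) if old[name] != new[name]
    )
    return {"added": added, "removed": removed, "changed": changed}

v1 = {
    "Customer.id": {"type": "int", "pk": True},
    "Customer.email": {"type": "varchar(100)", "nullable": False},
}
v2 = {
    "Customer.id": {"type": "bigint", "pk": True},  # type widened
    "Customer.email": {"type": "varchar(100)", "nullable": False},
    "Customer.phone": {"type": "varchar(20)", "nullable": True},
}
print(diff_versions(v1, v2))
```

A file-level tool like Subversion can only report that the model file changed; an object-level compare names the exact attribute that changed, which is what you need to assess the impact of that change.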

A drawing tool isn’t going to have these features.


Enterprise Readiness

If the tool doesn’t fit well within a corporate IT environment, it’s going to be difficult to get support from your own IT resources. Support for the types of requirements you’d have for a business application is key.  Active Directory integration, password complexity requirements, required installation permissions, supported target DBMSs and data file location configuration are just some examples of things to evaluate.

Your data models are production data for you, the modeler. You want production level support from your company’s IT resources.  If the product isn’t enterprise ready, you’re going to have a tough time getting that support.

Community Support

A vendor that participates and supports professionals in the field is one that is demonstrating how much they want to see users of their products succeed.  Do they have product evangelists who help users succeed by blogging, creating videos, speaking at events, etc.?

Does the vendor sponsor user groups? Conferences you attend?  Do they have a user group they fund?  Do they have a formal product advisory group that helps product management understand the needs of data modelers in the field?

A vendor that gives back to the community is more likely to deliver the product features you need, plus help you be successful at data modeling.


Costs

All of the above features come at a cost, as do all enterprise-class tools.  But organizations are going to pay those costs whether they manage their data architectures with Excel, Visio or FreeDrawingTool or with a real data modeling tool.  In my experience, the real costs of using these “cheap workarounds” greatly exceed the costs of using a real enterprise data modeling tool, due to lost productivity, data modeling errors, lack of reuse and wasted business user time recreating the same data models over and over again.


Data modeling is more than diagramming tables.  ERDs show about 10% of the information in a professional data model.  You need tools that support your environment and data modeling goals.   With the right tool, modeling is less expensive and faster, and your models will continue to deliver benefits for years and decades afterwards.

Speaking: You’ve Just Inherited a Data Model – Now What? 16 June Webinar

Jun 16, 2011   //   by Karen Lopez   //   No Comments

On Thursday, 16 June 2011 2PM ET I’ll be presenting a webinar for the Professional Association for SQL Server (PASS) Data Architecture virtual chapter (DArchVC) on working with industry standard data models.  Attendance in the webinar is free.  No pre-registration is required; just put the Live Meeting link on your calendar for 16 June 2011, 2PM EDT.

The DArchVC chapter is fairly new and I have been somewhat involved helping get it set up.   The people doing most of the work are:

From our chapter page:

Data Architecture underpins just about everything we do in IT.  Without a clear understanding of how data is structured, there is no reliable way to derive meaning from it.  Data Architecture drives the design and governance of data systems, and thus should be of interest to any data professional.  Data Architecture provides the blueprints that we all share, whether we be DBAs, data integration developers, database developers, data warehousing professionals, data presentation developers or client-side application developers persisting data for later retrieval.

The PASS Data Architecture Virtual Chapter will focus on data architecture concerns as they impact users, developers and DBAs on the Microsoft SQL Server platform.  We want to make data architecture accessible to all data practitioners, and drive the point home that Data Architecture is a set of practices and a body of knowledge that overlaps almost all database professionals to some degree.

The session abstract:

You’ve Just Inherited a Data Model – Now What?

The good news is that someone else has done the hard work of architecting a data model and you just have to take on minor maintenance…or is that the bad news? Or have you been tasked with implementing a pattern or industry standard data model?  Perhaps a team member has sent the world’s best resignation letter and won’t be helping you with the model.  Learn the 5 steps you MUST take before working with a new data model.

Attendees will also receive a detailed checklist for the 5 steps.

Karen López is a principal consultant at InfoAdvisors, Inc., a Toronto-based consulting firm. Karen is a frequent speaker at DAMA, SQLSaturdays and PASS conferences, including 24 Hours of PASS, the PASS Summit and PASS SQLRally.  She has 20+ years of experience in project and data management on large, multi-project programs.  Karen specializes in the practical application of data management principles.

Karen is also the ListMistress and moderator of the InfoAdvisors Discussion Groups at

As usual, I’ll try to keep track of conversations on Twitter.  Let’s use the hashtag #DArchISDM.  Join us on Thursday and come prepared with your comments, observations, and questions about working with third-party data models.
