
The Job of the Future

Feb 8, 2012   //   by Karen Lopez   //   Blog, Generations, Professional Development  //  2 Comments

I agree with this:

The intellectual equipment needed for the job of the future is an ability to define problems, quickly assimilate relevant data, conceptualize and reorganize the information, make deductive and inductive leaps with it, ask hard questions about it, discuss findings with colleagues, work collaboratively to find solutions and then convince others.

Robert Reich

Do you think we are educating our future generations for this sort of job of the future? Are we creating the type of learning and living environments that encourage our kids to tackle these kinds of tasks?

Preview to Tuesday, 7 February Webinar on Tailoring Data Models

Feb 7, 2012   //   by Karen Lopez   //   Blog, Data, Data Modeling, Database, Snark, Speaking  //  No Comments

Here’s a preview data model for my webinar on 7 February (yes, today!) at 1PM EST.  My topic is Help your Business Love its Data (Models): Tailoring Data Models for your Audience and it’s Part 2 of a three-part series on Getting Down to Business, sponsored by CA.



It’s free to register at

This is what I’ll be covering:

7 Feb 2012
1:00 p.m. – 2:00 p.m. EST

Duration 60 min.

There’s no one data model hiding in your modeling tool. There are actually thousands of them – not just multiple data model files, but different views and presentations of the same data model, each one ready to be used for different purposes and outcomes. In this session, Karen Lopez will discuss the steps in preparing and presenting the “right” data model for the right audience, as well as making data models accessible via the web. We will also cover 10 tips for ensuring that your audience is happy they attended the data model presentation and looks forward to attending the next one.

As usual for my presentations, this one will have a bit of snark and talk about good things to do in business data models, and show some anti-patterns for modeling, too.  That’s probably where the snark will come in.  Can’t guarantee it, but it sounds about right.

Bring your ideas about tailoring the presentation of data models or any other type of design.  Oh, and if you have any action figures, bring those, too.

Deadlines and Data Architects: You know the Date, Do You Know the Deliverable? #MemeMonday

Feb 6, 2012   //   by Karen Lopez   //   Blog, Data, Data Modeling, Database  //  7 Comments

It appears I’m an expert at deadline-driven accomplishments.  I mean, it’s meme Monday and even though I’ve known about the deadline for today’s post for weeks, I’m writing it at lunch on said day.  I’ve been thinking about what I’d write about for all this time, so it’s not like I’m just starting…but still, the deadline is what’s making this post happen.  I wish I could be the type of person who gets stuff done because it just needs to be done, but deadlines are what drive me.

The ultimate inspiration is the deadline.
-Nolan Bushnell

I’ve been thinking about some experiences I’ve had working with other data architects and working with deadlines.  Usually, as part of a team, we don’t get to set our own deadlines; they are set by a project manager or, if we are lucky, via negotiation with the DBAs, developers and other architects on the team.  Our deliverables are typically the input into their deliverables, so their success depends on our getting our deliverables to them.  In my experience, though, it’s too common for data architects and data modelers to forget what deliverable those teams are waiting for, or to fail to make it available in the right format.

In his post for #mememonday, Thomas LaRock (@sqlrockstar | blog) gives his best tip for working towards a deadline: work backwards.

What is Your Deliverable?

This is the most frustrating question.  It should be very apparent to a data architect what deliverables the team needs from them.  But time after time I see modelers focused 99.999% on the data models.  Yes, your data models should be beautiful.  They should be complete, correct and pretty.  Every single one of them should get a trophy. All the meta data you want to capture about them should be in the data model.  Do you think, though, that your development DBA and developers are tapping their feet waiting for the PDFs of the data models?  Or do you suppose that they just might want that alter script to run in their environment so that they can get to work on their deliverables?

Ideally, the team wants both.  They want the model to be created and they want the scripts with all the changes needed for their next deliverable.  I can assure you, though, that the scripts are what they want first.  I can also assure you that no one will love the data models more than you do, Mr. Data Architect.  So embrace the love you feel for the data models and funnel it into getting the "real" deliverable done:  DDL.  Sure, it hurts a bit that they don’t love your models as much as you do, but they will learn to love them because they produce beautiful databases.
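To make that concrete, the “real” deliverable usually looks something like the sketch below: a script the team can run, not a picture of a model. This is a hypothetical example – the table and column names are invented for illustration, not from any actual project.

```sql
-- Hypothetical alter script deliverable (table/column names invented).
-- This is what the DBAs and developers can apply directly to their
-- development environments to unblock their own deliverables.
ALTER TABLE dbo.Customer
    ADD MiddleName nvarchar(50) NULL;

ALTER TABLE dbo.CustomerOrder
    ADD CONSTRAINT FK_CustomerOrder_Customer
    FOREIGN KEY (CustomerID) REFERENCES dbo.Customer (CustomerID);
```

A PDF of the diagram can ride along with the script, but the script is the part that lets the rest of the team start working.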

On most of my projects (mostly development ones), I never lose sight of the fact that ultimately, the data models are a means to get to good-quality databases and data.

What Makes for a Good Deliverable?

I worked with a data architect who was very passionate about her data models.  She spent days making them beautiful.  She filled them with all kinds of nifty meta data to help people understand their data better.  She wrote definitions that were paragraphs long, then added more notes to them just to make it even better.  She often missed review meetings because she was too busy making a more beautiful data model.  She missed deadlines for delivering DDL because she wasn’t quite sure how to do that.  She researched data modeling layout techniques to make models more readable.  She demoed different layouts and took surveys about the alternatives to find just the right one.  And she got further and further behind on delivering revisions to development, the only direct customer of her work.  The team had gone from every-other-day deliveries of fixes and enhancements to the development databases to every other week.

Our modeler didn’t have time to learn the features of our target DBMS because she was busy making the model a perfect data model. Our sprints were failing because database changes were coming too slowly and were often done incorrectly.  The DBAs and developers had to spend days cleaning up the DDL she had produced because she didn’t know how to use those features of our modeling tool.  So not only did she miss the deadlines, she missed the deliverables.  She didn’t understand this, though. In her mind, the data model was all anyone needed.  In reality, what they really needed was working DDL scripts to apply to their environments.  That was what good meant to the team.

What Format?

I worked with another data architect who would spend months working on the data model, then publish it inside a PDF of a Word document.  The actual data model file was not shared.  No DDL was shared.  Nothing was delivered that teams could actually import into their tools or apply to their local development environment.  This also meant that it was nearly impossible to comment on the data model or easily provide corrections or feedback.  The modeler wanted corrections to come to him in emails and he would make them and generate the Word documents and PDFs months later.  Requests for the data model file itself were met with "You don’t need that, just retype the model in whatever tool you use".  We can’t build stuff on PDFs.  Produce them, sure.  But they aren’t the deliverables we are looking for.  Your team (your customers) can tell you what format works best for them.  Just ask.

Where Do You Deliver the Deliverables?

I work with many data architects who don’t use the same deliverable locations as the rest of the project team, and I think that’s a huge failure.  Professional development teams use some sort of versioning or configuration control environment to share their work product.  Data architects should use those, too.  Yes, we have our repositories and model marts, but those are typically only accessible by people who use the modeling tools.  They are my version control for the models, but I also deliver to the teams in the same place they expect to find all the components of their systems.  Maybe it’s just a wiki, or an intranet location. Wherever it is, I’m delivering there.

I worked with a data modeler who refused to use or learn the team’s versioning system.  So he just emailed files around, with no version numbers, and expected people to search through their emails to find the scripts to deploy in production environments. He seemed to randomly include and exclude people in the distribution list.  He often just attached a bunch of files, then said "here they are" in the body of the message.  Email is the worst versioning system in the world.  Don’t use it for deliveries.

Another approach I find annoying is fairly new.  I work with an architect who keeps all the files he works on for several clients on Dropbox. All in one giant folder.  When he gets his work done, he sends out an email saying: it’s all on Dropbox.  We as a team have to try to figure out what the files are, which is the right version, and try to ignore all the other stuff that’s sitting there in the folder.  As far as I’m concerned, he might as well randomly generate his data models.

Work Backwards

If you have the answers to those questions above, you can work backwards to meet your deadlines.  My inefficient team members mentioned above don’t do this.  They start with a generic to-do list of how to produce data models, start at step one and work their way as far as they can before they hit the deadline.  They often miss deadlines because they haven’t started at the end and worked backwards.  Don’t be that guy (or gal).  Know what you are expected to deliver, when, where, and in what format.

I start with what my deliverable is, what format I need to produce it in, and what measures I should use to ensure it’s good enough.  Then I start with the tasks that will get me closest to that completion.  Only when those are completed do I keep working backwards to fill in the other high-priority tasks.  If I still have time left, I do more.  I make it prettier.  I make it more useful.  I make it beautiful.  I love it.

Not all data architecture is done on development projects.  I understand that.  If your duties include supporting development teams, though, you need to support them.  That means loving your data models, the data AND databases.  There’s no reason why you can’t have it all.  Just remember which parts you’re supposed to deliver first.

Have you worked with data architects or modelers who worked backwards and got the job done?  What about people like those I’ve mentioned in this post?  What do you wish they had done instead?

Get Hands On with SQL Server 2012 – Virtual Labs

Feb 3, 2012   //   by Karen Lopez   //   Blog, Data, Database, Professional Development, SQL Server  //  1 Comment

Want to get your hands on SQL Server 2012 right now?  You can be up and running without needing a server or installing SQL Server locally.  Just visit SQL Server Virtual Labs (use IE) and choose a lab to get started.  The labs run in a virtual machine, so these aren’t just slides and a demo.  This is real, hands-on work with the tools.  Here’s your chance to get up to speed on SQL Server before the other 99%.

You can get real experience with new SQL Server 2012 features such as:

  • AlwaysOn, SQL Server’s new High Availability feature.  I have to say this is fun to play with.  Pull the plug and watch SQL Server gracefully fail over.  Like nothing happened at all.  Monitor the status of all the AlwaysOn components.  Very cool stuff here.
  • Master Data Services, SQL Server’s new feature for managing reference and master data.
  • Data Quality Services, a new feature for helping you Love Your Data even more.
  • Sparklines and Data bars, new data visualization features.
  • Columnstore Indexes, SQL Server’s new feature to make queries just fly.
  • Spatial and Location Services for mashing your data up with location based services.
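Some of these features take just one statement to try in the labs.  For instance, here’s a sketch of creating a columnstore index – the table and column names are invented for illustration, not from any lab script:

```sql
-- Hypothetical example: a nonclustered columnstore index on a fact table.
-- Table and column names are invented for illustration.
-- In SQL Server 2012, columnstore indexes are nonclustered only, and the
-- indexed table becomes read-only until the index is dropped or disabled.
CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_Columnstore
ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount);
```

Run a big aggregate query before and after creating the index and compare the plans – that’s where the “make queries just fly” part shows up.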




While you wait for the VM to set up, register for the SQL Server 2012 Launch events coming in March in the US.  I hope to make it to one or two of those.

