Friday, December 19, 2008

Data Tools at EclipseCon 2009

Hey all!

Yes, the holidays are almost upon us... But even better than that, EclipseCon 2009 is just around the corner! (Ok, maybe not RIGHT around the corner, but three months will zip by in no time.)

This year in the Data Tools track we have a tutorial coming up and a bunch of cool talks from a number of different directions.

Though I'm helping with the tutorial this year and moderating a block of short talks, I'm also talking about how to use DTP to connect to something other than a database. After all, not all data is in databases!

My long talk this year will show how DTP and the Data Source Explorer can be used for searching and viewing YouTube videos in Eclipse. How do these two worlds meet? Come to my talk and find out!

Besides that, I get to moderate a set of three cool short talks. These bright folks will talk about how DTP is being used in RCP as a database developer's type of tool, what's going on with the SQL Query Builder in Galileo, and how DTP is being used in cool ways by a commercial application.

So be sure to check out some of our talks this year! In another post, I'll talk some about what to expect from our tutorial and some of the other cool talks going on at EC2009 that I'm excited about!

Register today and don't forget to reserve a room at the hotel (it fills up fast!)!

Happy Holidays to all!


Tuesday, December 16, 2008

Out of Memory Errors...

Hi all...

I recently spent about a day and a half chasing my tail trying to track down some bizarre out of memory errors I was running into with Ganymede SR1 when debugging some DTP stuff and thought I'd pass along what I learned. I was continually running out of heap space while debugging, which was very troubling.

At first, I thought I had somehow corrupted my workspace. So I created a new workspace and tried it again (several times actually) and ended up with the same problem.

Next, I thought I'd try the Galileo Platform Galileo M3 build (along with the associated GEF, EMF, and DTP builds) and see if I got the error. At first I didn't, and I thought 3.5 might have fixed the problem... [sigh] No luck.

So then I started poking around on Google, trying to figure out how to bump up my heap space.

I kept running across this suggestion (which, after thinking about it, was kind of a "well, duh" kind of thing):
"Add -Xms256M -Xmx512M to the VM arguments for the runtime configuration or on the main Eclipse command line when you start it up..."
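For the record, when starting Eclipse itself (rather than a runtime configuration), those arguments go after -vmargs in eclipse.ini. The values below are just the ones from the suggestion, so tune them for your own machine:

```ini
-vmargs
-Xms256M
-Xmx512M
```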

Lo and behold it works now in Galileo and Ganymede. Life is good.
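If you want to double-check that the setting actually took effect in your runtime workbench, a tiny throwaway class (my own hypothetical HeapCheck, not part of DTP) can print the configured max heap:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() reflects the -Xmx setting (approximately).
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap: " + maxMb + "M");
    }
}
```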

But it was a frustrating end to the week, only to discover that I was just oblivious to the simple solution. :(

Hope this saves someone else from the pain and suffering!


Monday, December 15, 2008

New DB Support in DTP for Galileo...

Hey All!

Yes, it's been a while. But things are starting to hop with DTP for Galileo.

Our DTP 1.7 release will include some support for new databases and some updated support for existing ones...
  • Ingres has been kind enough to contribute their plug-ins for Ingres DB support in DTP for Galileo and we finally have them as part of our regular build.
  • I added some rudimentary SQLite support recently and that's now part of the regular build.
  • And Ivar and our friends at NexB were kind enough to contribute some big updates to our Microsoft SQL Server support, including new support for SQL Server 2008.
So big thanks to Ingres and NexB for making those contributions possible!

More changes are going in all the time as well... Enhancements and bug fixes mostly, but some new features as well.

Be sure to check out our M4 milestone build here for some early access!


Friday, November 7, 2008

Get your submissions in for EclipseCon 2009!

Hi all!

Yes, it's just the first week of November... But November 24 is creeping up on us all too quickly... 18 days will come and go and you'll be wondering where all the time went!

So it's time to think about a presentation for EclipseCon!

Simply want to share some cool tips or news with the world? Do a short talk! You only have to fill 10 minutes and it goes by like lightning.

Do you have more to say? Do a long talk! 50 minutes goes by very quickly.

Do you have too much to say in an hour and want to help people learn some new Eclipse skills? By all means try for a tutorial slot! Two hours of your very own to teach a few old (or new) developers some new tricks!

But don't wait too long! We'd love to see some DTP-themed talks. What cool things are you doing with Data Tools? Use it in new and twisted ways and share it with the rest of us!

Only 18 more days to get your proposals in!

Good luck and hope to see some talks soon!


Monday, September 22, 2008

Perceived Benefits of "Free" Open Source Software

Hi all...

Perceived benefits of Open Source involvement. I'm betting that nearly everyone in the Open Source community has wrestled with it at some point, either with customers, management, or both. And I have to admit, it was a bit of a shock to the system (in a good way) when I was introduced to the Eclipse community a few years ago. I didn't really have a good understanding of Open Source back then. And I may still not have a good handle on open source, but I'd like to think I know a little more than when I started.
Open Source means different things to different people. Some people just see the word "free" and get all giddy. Others see it as an opportunity to spread the wealth a bit and help out the community. Frameworks are popular in this respect - just look at Eclipse, Apache, and SourceForge. Each has its own piece to the puzzle and companies and developers can take those bits and assemble them in cool and unique ways if they meet their needs.

But ultimately the "free" of Open Source is that it's "free" to use, modify, and redistribute within the scope of the license agreement under which it's distributed. Note that I didn't say it's "free" in what it costs to create or maintain.

Often you'll hear the phrase "free as in beer" not "free as in speech". Or gratis vs. libre. It takes time, money, and all the various ingredients for whatever it is you're putting together - whether that be virtual, like software, or physical, like beer.

Let me expand on that a bit. I like beer, so it's an easy analogy to expand on. :)

When the "beer" flows freely, it acts as more than just a social lubricant (those EclipseCon evening social events we do so enjoy). I think it actually greases the wheels of progress so individuals can get beyond the gears and widgets they may be stuck on and move on to higher levels of complexity. (For example, focusing on the design of the "car" and not the "nuts and bolts" required to put it all together.)

But in order for the "beer" to flow, somebody has to make it. Somebody must grow the hops and barley, secure water rights, acquire a facility to ferment and bottle the results, and so on. It takes effort to combine these ingredients, as well as time and money, into a nectar that can be shared to do all these wonderful things in an open community of ideas.

Each Eclipse project has its own brand of beer. And not everyone will be able to use every kind of beer that's available. Think of it as a brewery introducing a new wheat beer. I'm not a big wheat beer fan, but I can appreciate the care that goes into making it, and many of the processes involved are the same used for other beers, so there's a shared or at least similar set of ingredients that we can help with or at least support in this community of peers.

To continue the "beer" analogy, the Eclipse Foundation is as much a bottling facility and brew pub as it is an actual brewery. Its role is to help distribute the beer around the world, but also to gather communities and raise awareness so the beer doesn't stop flowing due to a lack of participation from those communities. Free beer does nothing for anyone if nobody knows about it and nobody drinks it.

So I see my role as the titular head of DTP (if only because we need a head to chop off should things get out of hand) as doing three things at a high level...

  1. Represent Sybase's interests at Eclipse so we can continue making and drinking beer, whether it's a Sybase brand of beer or someone else's.
  2. Invite others to drink our beer so we start seeing more people drinking their way to newer and more exciting things that would then allow us to do bigger and better things as well.
  3. And make sure the beer continues to flow. Ingredients must keep coming to the brewery. New types of beer improve the richness of the overall production of the brewery, and sometimes we need to coordinate to help market and sell the beer to other distributors and markets to make sure the free exchange continues to perpetuate itself.
The grand goal is to continue to make the beer that we need to survive, but also allow open source, DTP, and Eclipse to expand and grow in surprising ways.

I'm constantly amazed at the breadth of products and projects that are beginning to adopt and use DTP for their own purposes. Every year, we add more great people and companies to the mix. So we need to continue to nurture and grow DTP to continue building awareness and adoption of our beer so that the immediate community as well as our customers are aware of our efforts.

So drink up... I'd like to see us making and drinking DTP beer for a long time to come. :)



Thursday, August 28, 2008

Creating the SQLite Connection Profile UI bits

Hi there!

So now we're almost to our first functional version of a SQLite connection profile in DTP. We have a driver-wrapper plug-in, a driver definition, an overridden catalog loader, and a connection profile with its associated connection and connection factory classes. What's next? Why, adding the UI so you can create the profile, of course!

With versions of DTP prior to Ganymede, this was a more difficult, but not horrible, task. In Ganymede, we've reduced the amount of work to adding three extension points and writing four key classes in an org.eclipse.datatools.enablement.sqlite.ui plug-in (basically just extending existing classes a tad, so there's little real work involved):
  • A connection profile wizard extension that uses the org.eclipse.datatools.connectivity.connectionProfile extension and uses the newWizard node so we can define our wizard
  • A property page extension (org.eclipse.ui.propertyPages) to define a property page so we can edit our SQLite connection profile instances
  • A driver UI contributor extension (org.eclipse.datatools.connectivity.ui.driverUIContributor) to create a reusable UI component that gathers the information we need for our SQLite connections
  • A connection profile wizard class that extends the org.eclipse.datatools.connectivity.ui.wizards.ExtensibleNewConnectionProfileWizard class
  • A connection profile wizard page that extends org.eclipse.datatools.connectivity.ui.wizards.ExtensibleProfileDetailsWizardPage
  • A connection profile property page that extends org.eclipse.datatools.connectivity.ui.wizards.ExtensibleProfileDetailsPropertyPage
  • And a driver UI component that is used on the wizard and property pages that implements org.eclipse.datatools.connectivity.ui.wizards.IDriverUIContributor
It may look daunting, but really it boils down to a few extension points, a few extended classes, and an instance of poor man's inheritance (copying a class from another project).
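As a rough sketch, the extension declarations in the UI plug-in's plugin.xml end up looking something like this. (The class names here, such as SQLiteProfileWizard, are hypothetical placeholders, and attribute names are from memory, so verify everything against the actual extension point schemas.)

```xml
<!-- Sketch only: class names are placeholders; check the schemas for exact attributes. -->
<extension point="org.eclipse.datatools.connectivity.connectionProfile">
   <newWizard
         id="org.eclipse.datatools.enablement.sqlite.ui.newWizard"
         name="SQLite Connection Profile"
         profile="org.eclipse.datatools.enablement.sqlite.connectionProfile"
         class="org.eclipse.datatools.enablement.sqlite.ui.SQLiteProfileWizard"/>
</extension>

<extension point="org.eclipse.ui.propertyPages">
   <page
         id="org.eclipse.datatools.enablement.sqlite.ui.profileProperties"
         name="SQLite Connection Properties"
         class="org.eclipse.datatools.enablement.sqlite.ui.SQLiteProfilePropertyPage"/>
</extension>

<extension point="org.eclipse.datatools.connectivity.ui.driverUIContributor">
   <driverUIContributor
         driverTemplateID="org.eclipse.datatools.enablement.sqlite.genericDriverTemplate"
         class="org.eclipse.datatools.enablement.sqlite.ui.SQLiteDriverUIContributor"/>
</extension>
```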

So let's get started!

DTP's Been Babel-ized!

Hi all!

Just thought I'd share some cool news from DTP land. I received word from Denis (via Bugzilla) that the initial translations for DTP's 1.6 (Ganymede) release have been added to Babel! Yay!

I would like to thank the folks who helped get us going with translations in Babel:
  • Antoine Toulmé
  • Yasuo Doshiro
  • Denis Roy
  • and Kit Lo
So thanks to everybody who helped get us started. Pretty soon we'll be able to have DTP in Klingon (probably not, but it's an interesting concept at any rate!)!


Wednesday, August 20, 2008

Creating an Actual SQLite Connection Profile (minus the UI)

Hi there!

So now we have the majority of our work done. We have a driver-wrapper plug-in, a driver definition, and an overridden catalog loader. What's next? Wrapping the functionality in a nice, easy to use connection profile!

Note: Previous articles in this series cover the following topics: Catalog Loaders, Driver Templates, and the Driver Framework.

Many moons ago, we talked at a high level about the driver template & driver definition frameworks. It's now time to talk briefly about the connection profile framework.

It all boils down to this... A connection profile manages a connection to something. Right now in DTP we connect to JDBC databases and file systems for the most part. But the Sybase WorkSpace product also uses DTP to connect to application servers, LDAP, UDDI repositories, and much more. So it's not limited in any way.

With that in mind, a JDBC database connection profile, such as the one we want to create for SQLite, just needs to manage a JDBC connection under the covers. We'll add a layer on top of that to attach the SQL Model to the connection so we can display the database specifics in the Data Source Explorer tree.

In the DTP Ganymede (1.6) release, we've really simplified creating a new connection profile if it's associated with a db definition vendor/version and a driver template. So we'll take advantage of that for SQLite.

To create a connection profile, we will go to the org.eclipse.datatools.enablement.sqlite plug-in project and create a couple of classes and two extension points. These steps are kind of chicken & egg - the order isn't really important so long as you get them all done.

Step 1: Create a new connection factory and connection class for SQLite. These are the actual raw connections that our SQLite connection profile will manage for us.

The connection class is pretty easy. We're just going to extend the Generic JDBC JDBCConnection class for SQLite so we have our own specialized version of it.

That code looks like this:
package org.eclipse.datatools.enablement.sqlite.connection;

import org.eclipse.datatools.connectivity.IConnectionProfile;
import org.eclipse.datatools.connectivity.db.generic.JDBCConnection;

public class SQLITEJDBCConnection extends JDBCConnection {

    /**
     * @param profile
     * @param factoryClass
     */
    public SQLITEJDBCConnection(IConnectionProfile profile,
            Class factoryClass) {
        super(profile, factoryClass);
    }
}

The connection factory requires a little more work, but not much more:
package org.eclipse.datatools.enablement.sqlite.connection;

import org.eclipse.datatools.connectivity.IConnection;
import org.eclipse.datatools.connectivity.IConnectionProfile;
import org.eclipse.datatools.connectivity.db.generic.JDBCConnectionFactory;

public class SQLITEJDBCConnectionFactory extends JDBCConnectionFactory {

    public SQLITEJDBCConnectionFactory() {
        super();
    }

    public IConnection createConnection(IConnectionProfile profile) {
        SQLITEJDBCConnection connection = new SQLITEJDBCConnection(profile, getClass());
        return connection;
    }
}

Basically, in the connection factory we're just creating an instance of our new SQLITEJDBCConnection class for the profile that's passed in.

Step 2: We want to add a new extension point to the plugin.xml in the org.eclipse.datatools.enablement.sqlite plug-in project: org.eclipse.datatools.connectivity.connectionProfile.
This extension point has a couple of nodes we're going to create beneath it: connectionFactory and connectionProfile.

Let's define our connectionProfile first, so we have the connection profile ID to use for the connectionFactory.

You can see from the screen that we're giving our connection profile the following properties:
  • id = org.eclipse.datatools.enablement.sqlite.connectionProfile
  • category = org.eclipse.datatools.connectivity.db.category (this ensures that our connection profile shows up under the "Databases" category in the DSE)
  • name = SQLite Connection Profile
  • icon = icons/jdbc_16.gif (you can copy this from the Generic JDBC connection profile plug-in)
  • pingFactory = org.eclipse.datatools.enablement.sqlite.connection.SQLITEJDBCConnectionFactory (our new connection factory class we created in step 1)
Then we define our connectionFactory:

Our connectionFactory extension has the following properties:
  • id = java.sql.Connection (this maps to the type of connection this connection factory/connection class maps back to -- in this case, a JDBC connection)
  • class = org.eclipse.datatools.enablement.sqlite.connection.SQLITEJDBCConnectionFactory (our connection factory class)
  • profile = org.eclipse.datatools.enablement.sqlite.connectionProfile (our SQLite connection profile ID from the connectionProfile extension)
  • name = SQLite Connection Factory
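Putting the two nodes together, the plugin.xml additions look roughly like this (a sketch assembled from the property lists above; double-check attribute names against the extension point schema):

```xml
<extension point="org.eclipse.datatools.connectivity.connectionProfile">
   <connectionProfile
         id="org.eclipse.datatools.enablement.sqlite.connectionProfile"
         category="org.eclipse.datatools.connectivity.db.category"
         name="SQLite Connection Profile"
         icon="icons/jdbc_16.gif"
         pingFactory="org.eclipse.datatools.enablement.sqlite.connection.SQLITEJDBCConnectionFactory"/>
   <connectionFactory
         id="java.sql.Connection"
         class="org.eclipse.datatools.enablement.sqlite.connection.SQLITEJDBCConnectionFactory"
         profile="org.eclipse.datatools.enablement.sqlite.connectionProfile"
         name="SQLite Connection Factory"/>
</extension>
```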
So now we have a connection profile for SQLite in DTP. Now all we need is a user interface (wizard, wizard page, property page, and driver UI) and we'll be golden!

That's what we'll cover next time.


Tuesday, August 19, 2008

Eclipse's e4 vs. Microsoft's e7

Hey all...

This is slightly off topic... But I just saw an interesting post over at PC Magazine's site... Evidently the Microsoft Windows 7 team has created a new blog (you can see it here) and the shorthand term for Windows 7 is e7.

Isn't it kind of odd in the year when e4 is getting ramped up for Eclipse that Microsoft would also have their own "e" term for a release? I guess there *are* only 26 letters in the alphabet... So there's a *chance* (small, but there) it was random...

Anybody else have any thoughts?

Thursday, August 7, 2008

DTP SQLite support continued... on to Catalog Loaders...

Hi all...

Yes, it's been a while since I wrote the last article in this series. I apologize for that. This summer has been busy both at work and home (not like I got to go on a cruise like Ed Merks or anything, but we've been bouncing around!). So the series of articles fell by the wayside a bit.

That said... Let's get back to it. When we last left our intrepid coders, we were working through the steps of trying to get SQLite to connect and show its underlying model (schemas/tables/procedures, etc.) in the Data Source Explorer. We had just finished creating some driver templates (see that article here) and that didn't really buy us all that much.

The next step is to then create a custom catalog loader to take care of any shortcomings of the SQLite driver. We have a db definition (vendor/version) to hang the catalog loader from, which means we just have to choose which level to focus on first.

In this case, since we're interested in schemas as the highest level of the model for SQLite, we'll override the schema catalog loader.

To do this, we need to do a few things.

1) In the plug-in manifest editor (opened either from MANIFEST.MF or plugin.xml) for the org.eclipse.datatools.enablement.sqlite plug-in project, we need to add a new dependency. Select the Dependencies tab in the manifest editor and add org.eclipse.datatools.connectivity.sqm.core. This provides the extension point we need in order to override the catalog loader.

2) On the Extensions tab, add a new extension and select org.eclipse.datatools.connectivity.sqm.core.catalog. Right-click on the extension and select "overrideLoader". Then select the "overrideLoader" node in the extension tree. You should see something like the following:

Product and version equate directly to the db definition that we've already created. Provider is the loader class we'll use in place of the default. And eclass is the SQL model class whose loader we want to override for our SQLite databases.

In this case:

* product = SQLITE
* version = 3.5.9
* eclass = org.eclipse.datatools.modelbase.sql.schema.Schema
* provider = a new class we'll create called SQLiteSchemaLoader
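In plugin.xml form, that overrideLoader node boils down to something like this (a sketch based on the values above):

```xml
<extension point="org.eclipse.datatools.connectivity.sqm.core.catalog">
   <overrideLoader
         product="SQLITE"
         version="3.5.9"
         eclass="org.eclipse.datatools.modelbase.sql.schema.Schema"
         provider="org.eclipse.datatools.enablement.sqlite.loader.SQLiteSchemaLoader"/>
</extension>
```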

The SQLiteSchemaLoader looks something like this:

package org.eclipse.datatools.enablement.sqlite.loader;

import org.eclipse.datatools.connectivity.sqm.core.rte.ICatalogObject;
import org.eclipse.datatools.connectivity.sqm.loader.IConnectionFilterProvider;
import org.eclipse.datatools.connectivity.sqm.loader.JDBCSchemaLoader;

public class SQLiteSchemaLoader extends JDBCSchemaLoader {

    public SQLiteSchemaLoader(ICatalogObject catalogObject,
            IConnectionFilterProvider connectionFilterProvider) {
        super(catalogObject, connectionFilterProvider);
    }
}

Since SQLite has no concept of a "schema", we need to dummy one up so that the model is satisfied. (Yes, this is one more case where the loose JDBC "standard" bites us in the rear when we try to adhere to it.) To do that, we really only need to focus on overriding a couple of key methods:

* protected void initialize(Schema schema, ResultSet rs) throws SQLException
* public void loadSchemas(List containmentList, Collection existingSchemas) throws SQLException

The default JDBCSchemaLoader relies on the driver to provide a result set of schemas. Since the SQLite driver provides none, we need to change the behavior to simply dummy up a schema object and pass it along.

So loadSchemas becomes:
public void loadSchemas(List containmentList, Collection existingSchemas)
        throws SQLException {
    // Reuse the cached "DEFAULT" schema object if we already created one.
    Schema schema = (Schema) getAndRemoveSQLObject(existingSchemas,
            "DEFAULT");
    if (schema == null) {
        // No result set to walk for SQLite, so build the dummy schema.
        schema = processRow(null);
        if (schema != null) {
            containmentList.add(schema);
        }
    }
    else {
        containmentList.add(schema);
        if (schema instanceof ICatalogObject) {
            ((ICatalogObject) schema).refresh();
        }
    }
}

And initialize is changed to ignore the result set and just name the schema "DEFAULT":
protected void initialize(Schema schema, ResultSet rs) throws SQLException {
    // SQLite has no schemas, so ignore the result set and use a fixed name.
    schema.setName("DEFAULT");
}

The last thing we have to do is make our catalog loader class actually run as an executable extension. If you don't add the following constructor, you'll get an InstantiationException, which is always a pain to track down:
public SQLiteSchemaLoader() {
    // Required for creation as an executable extension; the framework
    // supplies the catalog object and filter provider afterwards.
    super(null, null);
}

Once the dummy schema is in place, the driver actually does return the tables list correctly, as seen in the following screen shot.

However... It appears that the SQLite driver does not return getImportedKeys() directly (it throws a "not yet implemented" SQLException for me), so we will need to clean up the JDBCTableConstraintLoader before we're done, but we can do that during code cleanup. (We'll also have to remove some of the nodes that don't make sense for a SQLite database, such as Authorization IDs, Stored Procedures and User-defined Functions.)

So now we have a (mostly) working catalog loader that will connect to a SQLite database and allow us to drill in and see tables and columns.

In the next article we'll walk through the simplified process for creating a brand new connection profile (wizard, wizard page, property page, connection factory, and connection classes). And then we can talk about the clean up phases and some of the nice things we can do to help out our users and developers.

Hope that helps!


Tuesday, July 22, 2008

Ever wonder how to get a Database from a Connection Profile?

Hi all!

Have you ever wondered how to get a DTP SQL Model Database object from a connected connection profile? I seem to run into this problem infrequently, but always have to go through many gyrations to find the answer.

Larry from IBM, another member of the Connectivity team, was kind enough to provide the answer. It resulted in this chunk of code:

public Database getDatabaseForProfile(IConnectionProfile profile) {
    IManagedConnection managedConnection = profile
            .getManagedConnection("java.sql.Connection");
    if (managedConnection != null) {
        try {
            // The raw connection for the "java.sql.Connection" type is
            // actually a ConnectionInfo adapter, not a JDBC connection.
            ConnectionInfo connectionInfo = (ConnectionInfo) managedConnection
                    .getConnection().getRawConnection();
            if (connectionInfo != null) {
                return connectionInfo.getSharedDatabase();
            }
        } catch (Exception e) {
            // fall through and return null
        }
    }
    return null;
}
(Sorry the code's not pretty. I haven't found a great way of including code in Blogger blog posts yet.)

Basically under the covers there is a ConnectionInfo connection adapter that is used to map between the java.sql.Connection object we get from JDBC and the SQL model that's populated via the catalog loaders.

So there you have it! Not that many people have a need for such a thing, but it's handy just in case!


Thursday, July 17, 2008

How are YOU using DTP?

Hi all!

Now that the Ganymede release has gone out the door, DTP 1.6 has been released into the wild.

We in the DTP project would like to know how you are using DTP -- either as an end user, an adopter, or an extender. Are you using it to help with day to day database development tasks? Are you integrating with it from different projects in Eclipse such as BIRT, WTP, or JPA?

Though we're working on our first maintenance release for September, we're also starting to plan our next major release due in June 2009 along with the rest of the Eclipse Release Train. So we want to know what you would like to see in the next major release.

What do you like? What don't you like? We're here to help our community grow and develop. But to help us do that, we need guidance from the very folks who are using our stuff or looking at using it.

Let us know! Either by leaving a comment on this blog entry or by posting a message to the DTP newsgroup or mailing list!


Friday, June 27, 2008

New DTP Ganymede Video at Eclipse Live

Hi all!

Just thought I'd let you know as part of the Ganymede release, Ian had most of the main projects create 15-20 minute videos to include on Eclipse Live to highlight what's new and cool in Ganymede.

Well, I'm happy to say that the DTP demo is live. :)

You can take a look at it here.

Please be gentle, as it's my first live demo video. :)


Wednesday, June 25, 2008

DTP T-shirts... Get your t-shirts here...

Hey all...

In honor of the Eclipse Ganymede release and DTP's 1.6 release, I put together a t-shirt design so you can share your DTP love with the world. It's not much, but it's colorful and after one revision yesterday (thanks Linda!) I think it includes a majority of DTP projects, components, and terms in a creative way.

Pick one up to share with your friends and coworkers. :)

Thanks to everyone involved with DTP's Ganymede release!


Tuesday, June 24, 2008

Eclipse Ganymede is Released Into the Wild! (Repost)

Hi all...

It should be common knowledge, but Eclipse Ganymede (also known as Eclipse 3.4) is being released today! As said by the Eclipse folks (thanks Ian for writing a great press release as always) - "The Ganymede Release is a coordinated release of 23 different Eclipse project teams and represents over 18 million lines of code." It's an enormous undertaking by a diverse group of people, projects, and companies, so kudos to everyone involved!

Along with Ganymede is the next major release of DTP (version 1.6). With this release, we add some new functionality and streamline some of what was already there...

One of the things that people have asked for over the last few releases has been a graphical SQL query editor. And it's finally there! Yay!

There is still a bunch of work left to do with it, but for a first release I think it works great. I'm not a big fan of hand-coding joins in SQL statements and this makes it VERY easy to do that. Just click and drag!

Beyond that, we did a ton of work on usability in DTP 1.6 also. We've streamlined the process for creating driver definitions and connection profiles to the point where it only takes a few clicks before you're connecting to and drilling into the Data Source Explorer to view your databases. So a big thank you goes out to Max at JBoss and the Zend folks and everyone else who provided valuable feedback during our prototyping and implementation of these changes.

It's been a heck of a year, but I think DTP is better and stronger than ever. We will get started shortly on our first maintenance release (1.6.1, due out in September) and planning for the next major Eclipse release (in June 2009).

As always, we depend on the community for support and guidance going forward. As you start to explore our Ganymede release, keep an eye to the future and let us know what you'd like to see us do by next June!

Thanks to everyone involved in the DTP 1.6 release -- from Sybase, Actuate, IBM, JBoss, Zend, Ingres, and all those who I can't remember at the moment. Everyone take a bow as the curtain drops on Ganymede, enjoy a brief rest, and then it'll be time to get going again!


How do you add your own custom driver template? (Repost)

Hi all...

Sorry there's been a bit of a lag between articles. We've been busy trying to get Ganymede out the door and start planning for the future (maintenance releases for 1.6 plus the next major release of DTP for next June). I need to ditch my crystal ball for a magic eight ball I think. :)

Anyway... This week we're going to chat about how to create a new custom driver template.

First of all, when would you want to do this? There are a few possibilities:

  1. You're creating support for a new database type not currently covered by DTP Enablement.
  2. You want to add support for a third-party driver (such as DataDirect or jTDS) for a currently supported database.
  3. You simply want to add an alternative driver template to complement an existing driver that adds properties or changes default values for use in your application(s).
That said, let's pick #1. We can use it as an example to provide functionality through this article series and eventually add some new database support to DTP Enablement.

For this example, let's work on SQLite support.

You can find a ton of information about SQLite on the SQLite home page, and you can grab the SQLite JDBC driver from the SQLiteJDBC page. So we'll grab the SQLite binaries for Windows (in my case) and the sqlitejdbc-v051-bin.tgz for this case. (You'll need to put the sqlitejdbc.dll in your JRE's or SDK's JRE bin directory to get this working.)

Typically the process I work through when deciding whether or not we need a custom connection profile for a given database is as follows:
  1. Can I create a new Generic JDBC driver definition that references the jar (sqlitejdbc-v051-native.jar in this case)? Yes.
  2. Can I then create a new Generic JDBC connection profile that uses my driver definition from (1) to connect to the database? Yes.
  3. Can I browse into the database to see schemas, tables, stored procedures, and the like? Unfortunately not in this case.

This means we need to go a step further and go through these stages:
  • Stage 1: Create a new Database Definition for our Database
  • Stage 2: Create a new Driver Template for our Database (and the associated UI)
  • Stage 3: Create a new Connection Profile for our Database
  • Stage 4: Create a Custom Catalog Loader for our Database
So let's start with Stage 1 and get Stage 2 started today...

For each database that is supported in DTP and presents its structure in the Data Source Explorer (DSE), we have to tell the base models what the database supports. This is represented by the Database Definition (or "DB Definition"). What data types does it handle? Does it handle aliases or triggers? What kinds of constraints?

The DB Definition file itself is simply an XMI file (XML Metadata Interchange, an XML format for serializing model data). In this case, it maps back to an EMF model for the DB Definition.

I'm not going to go into the gory details here. But there's a good article on how to get started here (be sure to look at Scenario 2).

Basically we need to create an XMI file to tell DTP what basic properties this database adheres to. What data types does it support, does it have catalogs, and so on. Most of this information can be found in the database documentation.

To simplify the process a little, we have a sample Java file that can be customized to create a new XMI file. I've modified it somewhat to create the XMI file locally, and at the end of this exercise I'll post a zip with the necessary files on the DTP website so you can grab them.

Once that's created, you can create a plug-in wrapper called "org.eclipse.datatools.enablement.sqlite.dbdefinition". And in that plug-in wrapper we will use the org.eclipse.datatools.connectivity.sqm.core.databaseDefinition extension point to tell DTP about it. Basically the databaseDefinition extension point just maps the XMI file to a named vendor and a named version. That's how the underlying systems will locate it. (You'll see the terms vendor and version appear later as we define our driver template as well.)
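As a sketch, that extension might look roughly like the following in the plug-in's plugin.xml. The extension point ID, vendor, and version come straight from the text above; the exact attribute names and the runtime/ path for the XMI file are my best recollection of the DTP schema, so double-check them against the extension point's reference documentation:

```xml
<!-- plugin.xml of org.eclipse.datatools.enablement.sqlite.dbdefinition (sketch) -->
<extension
      point="org.eclipse.datatools.connectivity.sqm.core.databaseDefinition">
   <definition
         product="SQLite"
         productDisplayString="SQLite"
         version="3.5.9"
         versionDisplayString="3.5.9"
         file="runtime/vendors/SQLite_3.5.9/SQLite_3.5.9.xmi">
   </definition>
</extension>
```

The product/version pair here is the vendor/version key the underlying systems use to look up the DB Definition.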

So now we have a DB Definition and a plug-in wrapper for it. Cool. Now we can move to the first part of Stage 2: Creating a driver template.

To create a new driver template, we'll start by creating another plug-in. This plug-in will house all the non-UI bits and pieces we want for our SQLite connection profile. We'll call it "org.eclipse.datatools.enablement.sqlite".

In the manifest for our new SQLite plug-in, we will create a new org.eclipse.datatools.connectivity.driverExtension extension. This extension point is used to register driver template categories and driver templates within the DTP framework.

Remember how we were talking about vendor and version earlier? Well, now we're going to map some driver template categories to them.

First we'll create a "SQLite" category, which maps back to the vendor name we chose earlier and has org.eclipse.datatools.connectivity.db.driverCategory as a parent. All database drivers fall under this category in DTP so we can easily find them. We'll call our new category "SQLite" and give it an ID "org.eclipse.datatools.enablement.sqlite.driver.category".

Next, we'll create a "3.5.9" category to map to the version we selected earlier (3.5.9 is the most recent version of SQLite I could find). This one will use our "SQLite" category as its parent. We'll call it "3.5.9" and give it the ID "org.eclipse.datatools.enablement.sqlite.3_5_9.category".

Lastly we'll create the driver template itself. We'll give it the name "SQLite JDBC Driver" (not very original, but easy to remember) and the ID "org.eclipse.datatools.enablement.sqlite.3_5_9.driver". For its parent, so it has some context, we'll use the SQLite 3.5.9 category ID we made a second ago: "org.eclipse.datatools.enablement.sqlite.3_5_9.category".

We know it needs a driver jar, so we'll provide the default jar name "sqlitejdbc-v051-native.jar". If we get fancy later, we can provide some mechanisms to pre-populate the path to the local copy of that jar, but for now we'll assume the user knows where their jar is located and will set the path appropriately in the driver definition. (Yes, we'll talk about the "fancier" way to do this automatically later.)

Beyond that, we need to get a few key bits of information about the driver. Based on the documentation for SQLite, it appears that we require the following property values:
  • Driver Class: org.sqlite.JDBC
  • JDBC URL: jdbc:sqlite:test.db
Easy enough, right? Well, we also require a few other things for a standard driver definition:
  • Vendor: SQLite
  • Version: 3.5.9
  • Database name: TEST (we can extrapolate this from the sample URL)
  • User ID: (not applicable, so we just leave it blank)
With these basic bits and pieces, we have defined our driver template! Whew. Took a bit of work though, I know.
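Pulling all of the pieces above together, the driverExtension contribution might look roughly like this in plugin.xml. The extension point, category IDs, template ID, jar name, and property values all come from the steps above; the attribute names and the org.eclipse.datatools.connectivity.db.* property IDs are my best recollection of the DTP schema, so treat this as a sketch rather than copy-paste-ready:

```xml
<!-- plugin.xml of org.eclipse.datatools.enablement.sqlite (sketch) -->
<extension point="org.eclipse.datatools.connectivity.driverExtension">
   <!-- "SQLite" vendor category under the root database category -->
   <category
         id="org.eclipse.datatools.enablement.sqlite.driver.category"
         name="SQLite"
         parentCategory="org.eclipse.datatools.connectivity.db.driverCategory"/>
   <!-- "3.5.9" version category under the SQLite category -->
   <category
         id="org.eclipse.datatools.enablement.sqlite.3_5_9.category"
         name="3.5.9"
         parentCategory="org.eclipse.datatools.enablement.sqlite.driver.category"/>
   <!-- The driver template itself, with its default property values -->
   <driverTemplate
         id="org.eclipse.datatools.enablement.sqlite.3_5_9.driver"
         name="SQLite JDBC Driver"
         parentCategory="org.eclipse.datatools.enablement.sqlite.3_5_9.category"
         jarList="sqlitejdbc-v051-native.jar"
         createDefault="false"
         emptyJarListIsOK="false">
      <properties>
         <property id="org.eclipse.datatools.connectivity.db.driverClass"
               name="Driver Class" value="org.sqlite.JDBC"/>
         <property id="org.eclipse.datatools.connectivity.db.URL"
               name="Connection URL" value="jdbc:sqlite:test.db"/>
         <property id="org.eclipse.datatools.connectivity.db.vendor"
               name="Vendor" value="SQLite"/>
         <property id="org.eclipse.datatools.connectivity.db.version"
               name="Version" value="3.5.9"/>
         <property id="org.eclipse.datatools.connectivity.db.databaseName"
               name="Database Name" value="TEST"/>
         <property id="org.eclipse.datatools.connectivity.db.username"
               name="User ID" value=""/>
      </properties>
   </driverTemplate>
</extension>
```

A Driver Definition created from this template picks up these values as editable defaults, exactly as described above.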

That said, we now have reusable bits we can take into the next part of this process, which is creating a connection profile that can use our new driver definition and driver template.

At this point we're just laying the groundwork. You can find a zip with the plug-ins created during this exercise here.

So next time we'll look at creating a basic connection profile that can actually use these bits!


The DTP Driver Framework (Repost)

Hi all...

In this article, we're going to cover an introduction to the DTP Driver Framework. What are Driver Templates/Driver Definitions? What do we do with them? Where does the UI fit in? Where does the properties provider fit in? Can I make a Driver Template into a hat? Where does the Database Definition fit in?

All of these questions and more will be answered by the end of the article, I hope.

So let's start with an easy one. What is a Driver Template? A Driver Template is a named collection of properties needed to define a usable Driver Definition.

A Driver Definition is a concrete instance of a Driver Template, with specific paths to driver jars or other particular information that helps users of that driver create a Connection Profile that uses it.

We'll cover Connection Profiles in another article, but note that Driver Templates and Driver Definitions are optional for Connection Profiles. We use them for JDBC Connection Profiles because they provide an easy way to manage drivers and driver jars without duplicating that information in each Connection Profile instance.

Basically we have ([] = optional):

[Driver Template -> Driver Definition] -> Connection Profile

Each Driver Template can fall into one parent category, but categories can be split up in many ways. For example, we have a Database category, and then break that down by Vendor (IBM, Sybase, etc.) and Version (DB2 8.1, ASE 15, etc.). These categories provide groupings for Templates so we can refer to a parent category of Templates in particular controls used for Connection Profile creation/editing - like the DriverListCombo. You can give the DriverListCombo a category ID (like the one for Derby driver templates for example) and it will grab all of the Driver Definitions that use Driver Templates that fall under that category or any of its child categories.

(Brief sidebar... In Europa and before, we used to display Driver Definitions on the Driver Definitions preference page in a tree that followed this scheme... Databases -> Vendor -> Version -> Driver Definition, but now in Ganymede we arrange this differently so it's easier to sort by vendor or version or category to get at what you're looking for.)

Think of a Driver Template as a basic set of properties needed for a driver. Driver Templates require a unique ID, a name, and a parent category. Though jar lists are required for managing JDBC drivers, they're not mandatory for all Driver Templates, so you can specify if an empty jar list is ok. You can also provide a default name for Driver Definitions that use the Template as well as a class that can programmatically provide values for properties when Driver Definitions are created.

In addition, you can provide a list of other properties for the Driver Template. It's quite customizable. For JDBC templates, we require:
  • Driver Class
  • Vendor
  • Version
  • Database Name
  • Connection URL
  • User ID
  • Password
As an example, the Derby Embedded JDBC Driver template for version 10.0 of Derby has the following basic properties:
  • id = org.eclipse.datatools.connectivity.db.derby.genericDriverTemplate
  • name = %DERBY_EMBEDDED_DRIVER_TEMPLATE_NAME (which resolves to "Derby Embedded JDBC Driver")
  • description = (empty for now, but this may be used in the future in the Driver Definition UI as a tooltip)
  • parentCategory = org.eclipse.datatools.connectivity.db.derby.10_0.driverCategory, which is a child category of org.eclipse.datatools.connectivity.db.derby.driverCategory, which is a child category of org.eclipse.datatools.connectivity.db.driverCategory (so you can see the hierarchy - all Database drivers fall under "org.eclipse.datatools.connectivity.db.driverCategory", and then Derby further defines it to a "Derby" category and then a category for "10.0" beneath that)
  • jarList = derby.jar (multiple jars could be specified here, and this property can be further modified by the valuesProvider class)
  • createDefault = false (indicates whether we should create a default Driver Definition for this template when a new workbench is created for the first time)
  • emptyJarListIsOK = false (indicates that a jar list is required for this Derby 10.0 template)
  • valuesProvider = org.eclipse.datatools.connectivity.apache.internal.derby.driver.DerbyDriverValuesProvider101 (this is a class that implements the IDriverValuesProvider interface and is used in the Derby case to see if one of two JDBC driver file plug-in wrappers exists in the workbench - if it does, it will create a Driver Definition by default, changing the jarList to have a valid path to the jar and setting createDefault to true)
  • defaultDefinitionName = %DERBY_EMBEDDED_DRIVER_DEFAULT_INSTANCE_NAME (resolves to "Derby Embedded JDBC Driver 10.0" and is used if the valuesProvider finds the right plug-ins and creates a default Driver Definition)
Then when a Driver Definition is instantiated for that Driver Template, it uses the values from the Driver Template as defaults and allows the user to modify them. Think of this as a concrete instance of a template. Sort of like a Microsoft Word template used to create a Microsoft Word document. The template just suggests some defaults for the document, but the document is the actual file being modified.

So with our Derby example, we create an instance of the Derby Driver Template that takes those defaults and allows the user to modify the values to make the Driver Definition "valid". This means, in effect, that a) any jar files specified in the jar/zip files list are accessible and b) any required properties have values. We point to a concrete path for our derby.jar file, maybe modify the default JDBC URL and user name for our basic installation, and we're on our way.

As we mentioned earlier, if the valuesProvider class finds an appropriate plug-in wrapper for the Derby driver, it will create the Driver Definition by default and the user won't have to modify it unless they need to tweak the defaults for their installation.

Once we have a Driver Definition defined for a given database type, we can then create a Connection Profile that uses the properties of the Driver Definition as defaults (jar list and driver class are shared between the Driver Definition and the Connection Profile), and we can specialize our Profile details further, perhaps adding some optional properties to the url, a unique port or database name/path, or providing a more specific user name/password combination.

Now... I mentioned in my introduction last week that you can do more with Driver Templates and Driver Definitions than just use them in database Connection Profiles.

What if you want to define some basic communication protocols for a particular Connection Profile to consume? For example, you might define an "e-mail" driver that doesn't have any jars, but defines the basic properties of an SMTP server vs. a POP3 server vs. some other e-mail server type. So you could have two or three Driver Definitions that get consumed by an e-mail Profile (nobody's written one yet, want to volunteer?) to access and display e-mail sitting on a server somewhere.
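To make that concrete, here's a purely hypothetical sketch of jar-less "e-mail" driver templates. Everything under com.example (IDs, property names, host/port values) is invented for illustration; only the driverExtension extension point itself comes from DTP, and its attribute names are my best recollection:

```xml
<!-- Hypothetical: jar-less driver templates for e-mail server types -->
<extension point="org.eclipse.datatools.connectivity.driverExtension">
   <!-- A top-level category for e-mail server "drivers" -->
   <category
         id="com.example.email.driverCategory"
         name="E-mail Servers"/>
   <!-- No jars needed, so emptyJarListIsOK is true -->
   <driverTemplate
         id="com.example.email.smtp.driverTemplate"
         name="SMTP Server"
         parentCategory="com.example.email.driverCategory"
         emptyJarListIsOK="true">
      <properties>
         <!-- These property IDs are invented for the example -->
         <property id="com.example.email.host" name="Host" value="smtp.example.com"/>
         <property id="com.example.email.port" name="Port" value="25"/>
      </properties>
   </driverTemplate>
   <driverTemplate
         id="com.example.email.pop3.driverTemplate"
         name="POP3 Server"
         parentCategory="com.example.email.driverCategory"
         emptyJarListIsOK="true">
      <properties>
         <property id="com.example.email.host" name="Host" value="pop.example.com"/>
         <property id="com.example.email.port" name="Port" value="110"/>
      </properties>
   </driverTemplate>
</extension>
```

An e-mail Connection Profile could then pull Driver Definitions from the com.example.email.driverCategory and connect with whichever server details the user filled in.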

Or maybe you want to use DTP to help manage connections to application servers? There's a wide variety of those... JBoss or Tomcat or EAServer or any number of others... Why not come up with a common Connection Profile that consumes "application server" Driver Definitions to show... message queues or provide administration functionality to start/stop a server, or view the log for the server.

Perhaps you're playing with social networking sites and you want to come up with a common set of functionality across them... Would you be able to perhaps use OpenSocial with DTP to provide that functionality?

There are a ton of possibilities to explore.

Think of Driver Templates and Driver Definitions as helpers for Connection Profiles. They're not required, but they provide a place to share common details among profiles so you don't have to duplicate them in a bunch of places. For example, what if you have a bunch of database Profiles that don't use the Driver Framework and you want to update your driver class/jar list for a new version of the JDBC driver? Depending on how you implemented it, your user might have to go into each and every Profile that references the old files/driver class and update them by hand. If you use the Driver Framework instead, you update the definition in one place, and the very next time you connect, the change gets picked up automatically by each Profile.

So what do you think? Can you think of places where it would be handy to use DTP's Connection Profile and Driver Frameworks to meet a need?

As always, holler if you have questions or think I'm making this stuff up as I go along. I value the input of the community. There's LOTS of room to explore to see where we can take the DTP frameworks and I'm happy to be along for the ride!

Next week we'll start talking about how to implement your own custom Driver Templates.
Until next time...

DTP Article Series Introduction (Reposted)

Hi all...

I am going to be writing about some of the cool things you can do with the Data Tools Project (DTP) and how you can set up your own connection profiles for a variety of purposes. My goal with this series of articles is to provide a baseline for more people to understand the DTP frameworks for connectivity so they may build on them or use them in any way they see that's useful.

DTP is more than databases. Anyone who's played with BIRT realizes that ODA, which is part of DTP Connectivity, shows the potential for accessing a variety of data sources, from comma-separated files and XML to the variety of databases we support today through the DTP Enablement project. Though SQL is cool, it's also nice to have a consistent interface to your data when you're working in Eclipse.

So I'm going to break this into a series of articles and try to post fairly regularly -- probably one a week or so -- to cover the various bits and pieces of the DTP Connectivity frameworks and how a developer could use these bits and pieces in their own tools.

We're going to cover Driver Templates and Driver Definitions first, which might seem a bit odd considering what I said just a couple of paragraphs ago about DTP being more than just databases. However, Sybase has used driver templates in a few ways beyond just JDBC drivers. For example, we used them to describe the properties of a security specification that adopters and users could then expand on for their own purposes. Templates can be anything you want them to be, and we'll cover some possible uses of the driver framework in this first series of articles.

Once we've covered drivers, we'll start talking about catalog loaders. Unlike drivers, catalog loaders are definitely a database-specific thing at this point, leveraging the Database Definition and SQL Model from the DTP Model Base project to populate a rich EMF model with JDBC-related information about a particular database. This allows you to do a variety of cool things with the SQL model.

And lastly, but definitely not least, we'll then cover the connection profile framework. Here's another area, like the driver framework, where you're not limited to databases. A connection profile can connect to a JDBC database, a CSV file, an XML file, or whatever you'd like to browse. By integrating with the Platform's Common Navigator Framework, you can integrate with file systems or FTP providers, or application servers, or newsgroups if you want to make a newsgroup reader -- pretty much anything you can connect to and pull back information from, you can make a connection profile for.

So as you can see, there's a lot of functionality available in DTP. I'll do what I can to help write about the various aspects of DTP Connectivity that you may not have known about.

As always, if you have questions, post them in the comments, post them on the newsgroup, post them on the mailing list -- wherever it's convenient for you to do so. And we'll do our best to get them answered in a timely manner.

Thanks for your time! Next week we'll start talking about the Driver framework.


DTP Blog, Take 2...

Hi all...

You might have noticed some funky posts hogging the top spots on Planet Eclipse over the weekend. Sorry about that. The blogging software that Sybase is using isn't the greatest. Though it allowed me to schedule posts for the future, it didn't afford the same functionality to the RSS feeds, which messed everybody up. (Sorry Denis, Gunnar, and Ian).

However, after talking a bit with the folks handling the blogs, it was decided I should just create a separate blog in Wordpress or Blogger... Though I like Wordpress better (I use it for my personal blogs), it was having issues with getting a new free account created -- so here I am at Blogger!

Anyway... I feel bad about the switch, but hopefully this means you won't have to see my posts over and over and over again. I'm sure once is enough on Planet Eclipse for everybody. :)

I'm going to repost my articles so far on the Driver Framework, but promise I won't go any further than that.

Thanks for your patience!