
BindingSource, Transaction Sandboxes, and Pre vs. Post Add Modalities

13 Mar 2006 · CPOL · 12 min read
An investigation into different data entry modalities and the need for a transaction sandbox.

Introduction

This is the third article in a series discussing DataTable transactions; it builds on the two preceding articles.

In this article, I want to explore the new .NET 2.0 BindingSource class and the complexities of pre- versus post-add operations. But first...

Refactorings

  • Fixed the spelling of "uncommitted".
  • Added a RowAdded event to the transaction logger that fires when the row is actually added to the DataTable's row collection.
  • The DataTableSynchronizationManager now remembers the first non-sync'd record added to the transaction log. This prevents re-syncing previously synchronized transactions.
  • SynchronizationManager now supports passing PK values into an overloaded GetTransactions method.
  • TransactionRecordPacket can be initialized with PK values supplied externally.

Pre-add

A typical UI has an "Add" button that brings up a form allowing the user to enter the various fields. Clicking on OK typically calls the DataTable's AcceptChanges method or, if using the BindingSource class, the EndEdit method. If the user clicks on Cancel, the RejectChanges or CancelEdit method can be called. For example, given the following object graph initialization:

A DataTable:

XML
<data:DataTable def:Name="dataTable" TableName="PersonInfo">
  <Columns>
    <data:DataColumn ColumnName="PK" DataType="System.Guid"/>
    <data:DataColumn ColumnName="LastName" DataType="System.String"/>
    <data:DataColumn ColumnName="FirstName" DataType="System.String"/>
    <data:DataColumn ColumnName="Address" DataType="System.String"/>
    <data:DataColumn ColumnName="City" DataType="System.String"/>
    <data:DataColumn ColumnName="State" DataType="System.String"/>
    <data:DataColumn ColumnName="Zip" DataType="System.String"/>
  </Columns>
</data:DataTable>

and in code (because I'm lazy and don't want to write an extender to handle the PrimaryKey array; gads, I wish there were some consistency in how collections are handled in .NET):

C#
dataTable.PrimaryKey = new DataColumn[] { dataTable.Columns["PK"] };

The DataView:

XML
<data:DataView def:Name="dataView" Table="{dataTable}" 
                           Sort="LastName, FirstName"/>

The BindingSource:

XML
<BindingSource def:Name="bindingSource" DataSource="{dataView}"/>

The TransactionLogger:

XML
<cd:DataTableTransactionLog def:Name="dataLog" 
        SourceTable="{dataTable}" 
        TransactionAdded="{app.OnTransactionAdded}" 
        OnRowAdding="{app.OnRowAdding}"/>

And the input form with the necessary data bindings:

XML
<?xml version="1.0" encoding="utf-8"?>
<!-- (c) 2006 Marc Clifton All Rights Reserved -->
<MyXaml xmlns="System.Windows.Forms, System.Windows.Forms, 
                         Version=2.0.0000.0, Culture=neutral, 
                         PublicKeyToken=b77a5c561934e089"
  xmlns:data="System.Data, System.Data, Version=2.0.0000.0, 
               Culture=neutral, PublicKeyToken=b77a5c561934e089"
  xmlns:ctd="Clifton.Tools.Data, Clifton.Tools.Data"
  xmlns:cwf="Clifton.Windows.Forms, Clifton.Windows.Forms" 
  xmlns:cd="Clifton.Data, Clifton.Data" 
  xmlns:def="Definition"
  xmlns:ref="Reference">
  <Form Name="PersonInfoDlg"
    Text="Person Info"
    ClientSize="570, 200"
    MinimizeBox="false" 
    MaximizeBox="false"
    StartPosition="CenterScreen"
    AcceptButton="{btnOK}"
    CancelButton="{btnCancel}">

    <Controls>
      <Label Location="10, 50" Size="100, 15" Text="Last Name:"/>
      <TextBox Location="10, 65" Size="200, 20">
        <DataBindings>
          <cwf:DataBinding DataSource="{bindingSource}" PropertyName="Text" 
               DataMember="LastName"/>
        </DataBindings>
      </TextBox>

      <Label Location="220, 50" Size="100, 15" Text="First Name:"/>
      <TextBox Location="220, 65" Size="200, 20">
        <DataBindings>
          <cwf:DataBinding DataSource="{bindingSource}" PropertyName="Text" 
               DataMember="FirstName"/>
        </DataBindings>
      </TextBox>

      <Label Location="10, 95" Size="100, 15" Text="Address:"/>
      <TextBox Location="10, 110" Size="300, 20">
        <DataBindings>
          <cwf:DataBinding DataSource="{bindingSource}" PropertyName="Text" 
               DataMember="Address"/>
        </DataBindings>
      </TextBox>

      <Label Location="10, 140" Size="100, 15" Text="City:"/>
      <TextBox Location="10, 155" Size="100, 20">
        <DataBindings>
          <cwf:DataBinding DataSource="{bindingSource}" PropertyName="Text" 
               DataMember="City"/>
        </DataBindings>
      </TextBox>

      <Label Location="120, 140" Size="60, 15" Text="State:"/>
      <TextBox Location="120, 155" Size="60, 15">
        <DataBindings>
          <cwf:DataBinding DataSource="{bindingSource}" PropertyName="Text" 
               DataMember="State"/>
        </DataBindings>
      </TextBox>

      <Label Location="200, 140" Size="60, 15" Text="Zip:"/>
      <TextBox Location="200, 155" Size="60, 15">
        <DataBindings>
          <cwf:DataBinding DataSource="{bindingSource}" 
                      PropertyName="Text" DataMember="Zip"/>
        </DataBindings>
      </TextBox>

      <Button def:Name="btnOK" Location="450, 10" Size="80, 25" 
                                   Text="OK" Click="{app.OnOK}"/>
      <Button def:Name="btnCancel" Location="450, 35" Size="80, 25" 
                              Text="Cancel" Click="{app.OnCancel}"/>
    </Controls>
  </Form>
</MyXaml>

The following C# code instantiates the dialog and handles the dialog result:

C#
protected void OnPreAdd(object sender, EventArgs e)
{
    Parser p2 = new Parser();
    p2.AddToCollection += 
         new Parser.AddToCollectionDlgt(OnAddToCollection);
    p2.AddOrUpdateReferences(p.References);
    Form form = (Form)p2.Instantiate("personInfoDlg.myxaml", "*");
    bindingSource.AddNew();
    DialogResult res = form.ShowDialog();

    if (res == DialogResult.OK)
    {
        bindingSource.EndEdit();
    }
    else
    {
        bindingSource.CancelEdit();
        dataLog.CollectUncommittedRows();

        // Refresh the transaction grid.
        dgTransactions.DataSource = null;
        dgTransactions.DataSource = dataLog.Log;
    }
}

We can see that entering information in the dialog:

Image 1

results in the row being added to the DataGrid bound to the same BindingSource:

Image 2

The important thing about pre-add though is this line of code:

C#
Form form = (Form)p2.Instantiate("personInfoDlg.myxaml", "*");
bindingSource.AddNew();
DialogResult res = form.ShowDialog();

Behind the scenes:

  • A new row is added before the dialog is displayed.
  • Controls whose properties are bound to the data source's fields are cleared.
  • DataTable transactions occur because the control's properties are bound to DataTable fields.
  • Changes to the bound controls immediately affect the corresponding field of the bound DataTable.

Weird side-effects

Here's an interesting side-effect: as I change the values in the controls, the underlying DataTable is being updated, and hence so is any control, like a grid, that is also bound to it. If this doesn't appear to be happening, simply move the edit dialog away from the data grid it covers in the main window. This forces the DataGrid to refresh, and presto, the data I'm editing is suddenly present:

Image 3

This is a problem. The data is being written to the master DataTable before it is actually "committed". If you're using the same DataTable in, say, a search dialog, the user will see the changes that they're in the middle of making! If your application supports modeless dialogs, again sharing a system-wide DataTable, those dialogs will auto-magically be updated! This is not the preferred behavior, certainly not the behavior I prefer.
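
To make the mechanism concrete, here is a minimal console sketch (the table layout and values are illustrative, not taken from the demo) showing why this happens: the pending row created by AddNew is already visible to everything sharing the same BindingSource/DataView, which is why a bound grid displays the uncommitted data, and CancelEdit discards it again:

C#
using System;
using System.Data;
using System.Windows.Forms;

class PreAddSideEffectSketch
{
    static void Main()
    {
        DataTable table = new DataTable("PersonInfo");
        table.Columns.Add("PK", typeof(Guid));
        table.Columns.Add("LastName", typeof(string));
        table.PrimaryKey = new DataColumn[] { table.Columns["PK"] };

        DataView view = new DataView(table);
        BindingSource bs = new BindingSource(view, null);

        // What the pre-add dialog effectively does before it is shown:
        DataRowView pending = (DataRowView)bs.AddNew();
        pending["PK"] = Guid.NewGuid();
        pending["LastName"] = "Clifton";

        // Anything bound to the same BindingSource/DataView already sees the row...
        Console.WriteLine("View count while editing: " + bs.Count);         // 1
        // ...even though it hasn't been committed to the Rows collection yet.
        Console.WriteLine("Table rows while editing: " + table.Rows.Count); // 0

        // The Cancel path discards the pending row.
        bs.CancelEdit();
        Console.WriteLine("View count after CancelEdit: " + bs.Count);      // 0
    }
}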

Post-add

A post-add operation adds the row after the data has been entered. Typically, data binding is not used, since there is no DataRow instance for the currency manager to work with to keep the row's fields updated. Instead, one usually creates the row, populates it with the values from the controls, and adds the row to the DataTable instance.
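
A minimal sketch of that conventional approach, assuming an OK handler and text boxes named tbLastName, tbFirstName, and so on (the names are illustrative; the demo does not use this approach):

C#
protected void OnConventionalPostAdd(object sender, EventArgs e)
{
    // Create and populate the row only after the user has finished entering data.
    DataRow row = dataTable.NewRow();
    row["PK"] = Guid.NewGuid();
    row["LastName"] = tbLastName.Text;
    row["FirstName"] = tbFirstName.Text;
    row["Address"] = tbAddress.Text;
    row["City"] = tbCity.Text;
    row["State"] = tbState.Text;
    row["Zip"] = tbZip.Text;

    // The rest of the application sees the row only at this point.
    dataTable.Rows.Add(row);
}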

Instead of the procedure described above, I want to use a slightly different approach, using the DataTableTransactionLog that I've discussed in the preceding articles. Instead of binding the controls to the master DataTable, I'm going to clone the DataTable to get the structure, then use the cloned DataTable as my data source. If the user clicks "OK", the transactions are synchronized. This is accomplished in the OnPostAdd event and involves some minor modifications to what you saw in the OnPreAdd event:

C#
protected void OnPostAdd(object sender, EventArgs e)
{
  Parser p2 = new Parser();
  p2.AddToCollection += 
      new Parser.AddToCollectionDlgt(OnAddToCollection);

  // Clone the table structure.
  DataTable dtTemp = dataTable.Clone();

  // Create a new binding source.
  BindingSource bsTemp = new BindingSource(dtTemp, null);

  // Create a temp log for the temp table.
  DataTableTransactionLog logTemp = 
                     new DataTableTransactionLog(dtTemp);

  // but really, we're still creating the row first!
  DataRowView newRow = (DataRowView)bsTemp.AddNew();

  // Initialize the PK.
  newRow["PK"] = Guid.NewGuid();

  // Some MyXaml stuff.
  p2.References["app"] = this;
  p2.References["bindingSource"] = bsTemp;
  Form form = 
       (Form)p2.Instantiate("personInfoDlg.myxaml", "*");
  DialogResult res = form.ShowDialog();

  if (res == DialogResult.OK)
  {
    // Instantiate a syncMgr for the master table.
    DataTableSynchronizationManager sync = 
            new DataTableSynchronizationManager(dataLog);

    // Instantiate a syncMgr for the temp table.
    DataTableSynchronizationManager syncTemp = 
            new DataTableSynchronizationManager(logTemp);

    // Get the transactions.
    List<TransactionRecordPacket> packets = 
                              syncTemp.GetTransactions();

    // Add them to the master log.
    sync.AddTransactions(packets);

    // Sync up.
    sync.Sync();

    // All done.
    dataLog.AcceptChanges(); 

    dgTransactions.DataSource = null;
    dgTransactions.DataSource = dataLog.Log;
  }
}

It's interesting how we can use the exact same dialog to work in both configurations. But notice the comment "but really, we're still creating the row first!" Internally, we're creating the row first, but externally, to the rest of the application's controls bound to the table, we're only adding the row when the user actually clicks on OK.

True post-add

But what I really want to accomplish is expressed in this dialog:

Image 4

The idea here, expressed in the caption "Multi-Add", is to allow the user to enter multiple names and addresses:

  • without having to click on an "Add" button somewhere in the application, and
  • while maintaining the previous record's information so that it can be re-used.

Why do this? For example, there are three people living in my house - me, my girlfriend, and my son. Why should I have to re-enter the address information when I'm just adding three separate records, one per person? So, after that lengthy introduction, we can finally get to the point of this article - how to do this, so that:

  • The DataTable is not actually updated until the user clicks on Add New or Update (thus avoiding our side-effects).
  • A new row doesn't clear the controls bound to the data source.
  • We can log the transactions so that they can be used to synchronize a mirror table, say on a server.
  • We can offer undo/redo capability to the user for the record that they're editing (is this silly?).

What's clear is that data binding cannot be used directly in the above dialog because of constraints on when the new row is created: the new row is truly created (or is it?) only when the user clicks on "Add New".

But...

In many ways, the easy answer is to create the row in the AddNew event handler and manually initialize the row's fields with the control values. Frankly, if I have to do that for a hundred dialogs in my client's application, I'll go crazy. ORM? No, I don't want the client to be associated with the data access layer beyond the fact that it binds to a DataTable supplied by the server. I specifically do not want to create a client that either interfaces directly with the persistent store or has classes that are specific to the persistent store schema, directly or indirectly. The goal is to have a generic client, capable of working with data in this way, whether it's managing people, movies, books, whatever. As for just doing the whole thing with a DataGrid, well, maybe the user prefers a dialog like the one above instead of a DataGrid. DataGrids make life easier for the programmer, but are they what the user wants? Maybe, maybe not.

RowTransactionSandbox

The class that we're going to add to the transaction logger and synchronization manager is a row transaction sandbox. This class manages a sandboxed DataTable and DataRow, which are isolated from the source DataTable until explicitly synchronized (using our previous classes) with the source table.

UML diagram

The following illustrates how the sandbox fits into the previous article's UML diagram:

Image 5

The UML diagram illustrates how the sandbox manages transaction logs for both the master (source) data and the sandboxed data. It also manages the synchronization managers so that it can transfer the transaction packets from the sandbox log to the master log.

Events and data flow diagram

The following diagram illustrates the events and data flow:

Image 6

The above diagram illustrates the sandbox being initialized from either a "new row" or "existing row" event. If the sandbox is initialized with an existing row, that data has to come from a row in the master table. The sandbox, since it's managing the transactions for a single row, contains the set of PK values for that row. Initially, these values are null on a new row, or populated if starting with an existing row. Furthermore, the row fields are bound to the various controls in the dialog.

When the user adds a record, the application has to create new PK values, whether or not the record already existed. If the record is being updated, then nothing special happens with the PK values - they already exist. In both cases, the sandbox gets the row transactions, using the PK values in the transaction set. The transaction packets are passed to the master sync manager and the master source is updated. Since the row now exists, the sandbox is re-initialized with the existing row data, which sets up the transactions for potentially adding a new record with the existing data.

State diagram

And lastly, this diagram illustrates the state transitions, undo/redo, and log management:

Image 7

The above diagram should help in understanding the state transitions from adding a new row to updating an existing row, and potentially going back into the "new" state if the record is cleared. The diagram also illustrates that when a record is added, the "New Row" transaction is the first item in the log, and the original value transactions plus the change transactions are collected from the sync manager. If the record is being updated, only the change transactions are needed - the rest of the transaction log is discarded (we'll see this in the code).

Implementation

For all those diagrams, the implementation is actually quite simple. It's nice when a complicated concept can be implemented without a lot of code.

Initialization

In the Initialize method:

C#
public void Initialize()
{
  if (sourceLogger == null)
  {
    throw new DataTableTransactionException(
       "SourceLogger must be initialized.");
  }

  state = SandboxState.New;
  sandboxTable = sourceLogger.SourceTable.Clone();
  sandboxLogger = new DataTableTransactionLog(sandboxTable);
  sourceSyncMgr = new DataTableSynchronizationManager(sourceLogger);
  sandboxSyncMgr = new DataTableSynchronizationManager(sandboxLogger);
  sandboxLogger.RowAdding += new DataTableTransactionLog.RowAddedDlgt(
       OnRowAdding);
}

the sandbox:

  • clones the source DataTable, which copies the structure information,
  • initializes its own transaction logger,
  • initializes synchronization managers for both the sandbox and the source logger,
  • hooks the RowAdding event for some post-initialization after the BindingSource has added a row.

In the demo, the sandbox and BindingSource are initialized declaratively:

XML
<cd:RowTransactionSandbox def:Name="tSandbox" 
                      SourceLogger="{dataLog}"/>
<BindingSource def:Name="tBindingSource" 
          DataSource="{tSandbox.SandboxTable}"/>

Note that the sandbox's cloned DataTable is used as the DataSource.
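
For readers not using MyXaml, the imperative equivalent is roughly the following sketch (it assumes Initialize() is called explicitly after SourceLogger is assigned):

C#
// Imperative equivalent of the declarative markup above (a sketch).
RowTransactionSandbox tSandbox = new RowTransactionSandbox();
tSandbox.SourceLogger = dataLog;   // the master table's transaction logger
tSandbox.Initialize();             // clones the table and creates the loggers/sync managers
BindingSource tBindingSource = new BindingSource(tSandbox.SandboxTable, null);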

The RowAdding handler does some final initialization, tracking the row that was added and setting the state. Usually, the BindingSource.AddNew method is only called once for the entire lifetime of the sandbox, so the main purpose of this handler is to actually get the new row and save it:

C#
void OnRowAdding(object sender, RowAddedEventArgs e)
{
  row = e.Record.Row;
  State = SandboxState.New;
  lastOriginalIdx = 0;
}

Setup for adding a record

Once the BindingSource.AddNew() method is called (on the BindingSource attached to the sandbox's DataTable instance), the sandbox is ready to track transactions on a new row. The following code from the demo shows this:

C#
protected void OnMultiAdd(object sender, EventArgs e)
{
  Parser p2 = new Parser();
  p2.AddToCollection += 
      new Parser.AddToCollectionDlgt(OnAddToCollection);
  p2.AddOrUpdateReferences(p.References);
  Form form = 
      (Form)p2.Instantiate("personInfoMultiAddDlg.myxaml", "*");
  form.Tag = p2;
  BindingSource bsTemp = 
      (BindingSource)p2.GetReference("tBindingSource");
  bsTemp.AddNew();     // This is the important step!
  form.ShowDialog();
}

Setup for updating a record

Let's say you want to start with an existing record, either to update it or to modify it to create a new record. The process is exactly the same, except that the sandbox's BeginEdit() method is also called. This method initializes the sandbox's row with the values from the source row, and also initializes the PK values to be used for the entire transaction "set" managed by the sandbox:

C#
public void BeginEdit(DataRow srcRow)
{
  if (row == null)
  {
    throw new DataTableTransactionException(
       "Row not initialized. Call AddNew() on " + 
       "the binding source for the sandbox DataTabe first.");
  }

  foreach (DataColumn dc in sourceLogger.SourceTable.Columns)
  {
    row[dc.ColumnName] = srcRow[dc.ColumnName];
  }

  pkValues = new Dictionary<string, object>();

  foreach (DataColumn dc in sourceLogger.SourceTable.PrimaryKey)
  {
    pkValues[dc.ColumnName] = row[dc.ColumnName];
  }

  lastOriginalIdx = sandboxLogger.Log.Count;
  State = SandboxState.Existing;
}

What's critical in the code above is that this initialization creates transaction records as the source values are copied into the sandbox row. So, if the user decides to modify the current values (which also generates transactions) and then add the record as a new record, the sandbox will have a complete set of information - all the existing transactions plus any new ones already in the sandbox's log, ready for synchronization to the source's transaction log. Also, there's an internal field, lastOriginalIdx, which records how many transaction records make up this initial transaction list. When doing an update, all of these transactions can be discarded!

In the demo, the "Update" menu event handler calls the following method. Notice the call to BeginEdit() after the AddNew() call:

C#
protected void OnMultiUpdate(object sender, EventArgs e)
{
  Parser p2 = new Parser();
  p2.AddToCollection += 
      new Parser.AddToCollectionDlgt(OnAddToCollection);
  p2.AddOrUpdateReferences(p.References);
  Form form = 
      (Form)p2.Instantiate("personInfoMultiAddDlg.myxaml", "*");
  form.Tag = p2;
  BindingSource bsTemp = 
      (BindingSource)p2.GetReference("tBindingSource");
  RowTransactionSandbox ts = 
      (RowTransactionSandbox)p2.GetReference("tSandbox");
  bsTemp.AddNew();        // Create the sandbox's row
  ts.BeginEdit(((DataRowView)bindingSource.Current).Row);
  form.ShowDialog();
}

Add, Update, Clear, and Delete

The sandbox's Add method is called when a row, whether or not it exists in the original DataTable, is to be added. This code requires that the application supply the new PK values. These values are saved in the PK fields, and the entire sandbox transaction log is compacted, meaning that if you change a field from a, to b, to c, only the transaction from a to c is preserved. (The compactor also removes any transactions associated with a delete row transaction.) The transaction packets are acquired and the source is updated:

C#
public void Add(Dictionary<string, object> pkValues)
{
   if (row == null)
   {
     throw new DataTableTransactionException(
         "Row not initialized. Call AddNew() on the binding " + 
         "source for the sandbox DataTabe first.");
   }

   this.pkValues = pkValues;
   UpdateRowPKValues();
   sandboxLogger.Compact();
   List<TransactionRecordPacket> trpList;
   trpList = sandboxSyncMgr.GetTransactions(pkValues);
   UpdateSource(trpList);
}
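
The article's demo doesn't show the multi-add dialog's "Add New" button handler, but given the Add signature above, it would look something like this hypothetical sketch (the handler name is mine; tSandbox, dataLog, and dgTransactions follow the demo's naming):

C#
protected void OnAddNewRecord(object sender, EventArgs e)
{
   // The application is responsible for supplying the new PK values.
   Dictionary<string, object> pk = new Dictionary<string, object>();
   pk["PK"] = Guid.NewGuid();

   // Compacts the sandbox log, syncs its transactions to the master table,
   // and re-initializes the sandbox with the (now existing) row data.
   tSandbox.Add(pk);

   // Refresh the transaction grid, as in the other demo handlers.
   dgTransactions.DataSource = null;
   dgTransactions.DataSource = dataLog.Log;
}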

Update is a similar process, except there are no PK values to initialize. Also, compaction can't be done because this would move changed fields into the initialization transactions (created from the source row) that are being discarded in the RemoveRange call. At some point, that "issue" can be refactored. Here's the code:

C#
public void Update()
{
   if (row == null)
   {
      throw new DataTableTransactionException(
         "Row not initialized. Call AddNew() on the " + 
         "binding source for the sandbox DataTabe first.");
   }

   if (state != SandboxState.Existing)
   {
      throw new DataTableTransactionException(
                                "Can't update a new row.");
   }

   // Can't compact, as this changes the 
   // transaction list ordering. 
   // sandboxLogger.Compact(); 
   List<TransactionRecordPacket> trpList;
   trpList = sandboxSyncMgr.GetTransactions(pkValues);
   trpList.RemoveRange(0, lastOriginalIdx); 
   UpdateSource(trpList);
}
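
Likewise, a hypothetical "Update" button handler (also not shown in the article) would simply push the changed fields to the master table:

C#
protected void OnUpdateRecord(object sender, EventArgs e)
{
   // Only the change transactions (those after lastOriginalIdx) reach the master table.
   tSandbox.Update();

   dgTransactions.DataSource = null;
   dgTransactions.DataSource = dataLog.Log;
}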

The delete process directly manipulates the transaction log, discarding everything and injecting a DeleteRow transaction:

C#
public void Delete()
{
   if (row == null)
   {
      throw new DataTableTransactionException(
        "Row not initialized. Call AddNew() on the " + 
        "binding source for the sandbox DataTabe first.");
   }

   if (state != SandboxState.Existing)
   {
      throw new DataTableTransactionException(
                               "Can't delete a new row.");
   }

   sandboxLogger.ClearLog();
   sandboxLogger.Log.Add(new DataTableTransactionRecord(0, 
       row, DataTableTransactionRecord.RecordType.DeleteRow));
   List<TransactionRecordPacket> trpList;
   trpList = sandboxSyncMgr.GetTransactions(pkValues);
   UpdateSource(trpList);
   Clear();
}

Keep in mind, I specifically want to log the transactions happening to the master table so that a local table and a remote table can be synchronized. So while it may seem like a lot of work simply to delete a row in the master table, the point is illustrated by this screenshot:

Image 8

As you can see, there's the DeleteRow transaction in the master table!

The Clear method clears all transactions and puts the sandbox in the "new row" state. Note how it injects a "NewRow" transaction so that we're all set to add a new row and its field values:

C#
public void Clear()
{
   if (row == null)
   {
     throw new DataTableTransactionException(
       "Row not initialized. Call AddNew() on the " + 
       "binding source for the sandbox DataTabe first.");
   }

   ClearAllFields();
   // Clear the log, as the only transaction 
   // allowed now is add.
   sandboxLogger.ClearLog();

   // Setup for an "Add".
   sandboxLogger.Log.Add(new DataTableTransactionRecord(0, 
        row, DataTableTransactionRecord.RecordType.NewRow));
   State = SandboxState.New;
}

Now, the real magic happens in the UpdateSource method. Here, the sandbox transaction packets are added to the source sync manager and the source DataTable is synchronized. The sandbox transaction log is cleared, a "New Row" transaction record is injected to handle the case where a new row might be added using some of the existing field values, and the BeginEdit method is called, setting up all the initial values. The sandbox is now in the "update" state:

C#
protected void UpdateSource(
                 List<TransactionRecordPacket> trpList)
{
   sourceSyncMgr.AddTransactions(trpList);
   sourceSyncMgr.Sync();
   sandboxLogger.ClearLog();

   // Setup for an "Add".
   sandboxLogger.Log.Add(new DataTableTransactionRecord(0, 
        row, DataTableTransactionRecord.RecordType.NewRow));
   BeginEdit(row);
}

Demo application

The demo application does a lot with both declarative and imperative code, and illustrates the pre-add, post-add, and multi-add/update processes. All the transactions are displayed in the demo, so you can see how these modes differ with regard to creating transactions and updating the master data. Note that in this article I'm not covering state management - the menu item state and button state on the multi-adder dialog.

Conclusion

In some ways, it would have been easier to include the management of the BindingSource in the sandbox code; however, the BindingSource class requires System.Windows.Forms, and I wanted to keep the sandbox free of this dependency. This means that the application has to take a bit more responsibility for the interface with the sandbox than might initially be desirable.

Using the classes discussed in this article and the two previous articles, the programmer can now create data entry screens in a variety of modes, depending on the user requirements. The data transactions, being logged, are suitable for synchronization with a remote source.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


Written By
Marc Clifton
Architect, Interacx
United States
Blog: https://marcclifton.wordpress.com/
Home Page: http://www.marcclifton.com
Research: http://www.higherorderprogramming.com/
GitHub: https://github.com/cliftonm

All my life I have been passionate about architecture / software design, as this is the cornerstone to a maintainable and extensible application. As such, I have enjoyed exploring some crazy ideas and discovering that they are not so crazy after all. I also love writing about my ideas and seeing the community response. As a consultant, I've enjoyed working in a wide range of industries such as aerospace, boatyard management, remote sensing, emergency services / data management, and casino operations. I've done a variety of pro-bono work non-profit organizations related to nature conservancy, drug recovery and women's health.
