I usually cheat a bit with these.
When I read a JSON source, I want to load it into a database.
At first, I tried to decode it properly and so on.
That was messy and slow.
So I went the brute-force way: BULK INSERT from the JSON feed into a staging table (where every field lands as a string), then a stored procedure grabs the data from that input table, does the conversions and validation, and performs the UPDATE/INSERT into the real table.
JSON -> BULK INSERT into TEMP_BLAH
Call stored_procedure_blah: moves rows from TEMP_BLAH to BLAH,
doing an INSERT if the data is not already present, or an UPDATE if it is.
It's fast, and since the raw JSON lands in a disposable staging table, the internal table with the converted data stays protected from bad content.
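To make the staging side concrete, here is a sketch of what such an import table can look like (the table and column names are made up for illustration). Every column is a string type, so the bulk insert itself can never fail on a conversion; all the type checking happens later, in the procedure:

```sql
-- Hypothetical staging table: every column is plain text, so whatever
-- comes out of the JSON can be bulk-inserted without conversion errors.
CREATE TABLE [IMPORT_BLAH]
(
    [Id]        NVARCHAR(50)  NULL,
    [Name]      NVARCHAR(200) NULL,
    [Price]     NVARCHAR(50)  NULL,  -- converted to DECIMAL later, in the procedure
    [UpdatedAt] NVARCHAR(50)  NULL   -- converted to DATETIME later
);
```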
The C# code itself is simple. You create an HttpWebRequest object and point it at the JSON URL (Url is a string containing that URL):
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(Url);
try
{
    using (WebResponse response = request.GetResponse())
    using (Stream responseStream = response.GetResponseStream())
    using (StreamReader reader = new StreamReader(responseStream, Encoding.UTF8))
    {
        json = reader.ReadToEnd();
    }
}
catch (WebException ex)
{
    WebResponse errorResponse = ex.Response;
    if (errorResponse != null)
    {
        using (Stream responseStream = errorResponse.GetResponseStream())
        using (StreamReader reader = new StreamReader(responseStream, Encoding.UTF8))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
    return some_error_value;
}
(This code is written from memory and not tested, but it shows the idea.)
json is a string; it now contains the JSON data.
I then deserialize the JSON into an XmlDocument, using Json.NET's JsonConvert.DeserializeXmlNode:
System.Xml.XmlDocument xml = JsonConvert.DeserializeXmlNode(json, TableName);
TableName is a variable that contains the name of the table I want to inject the contents into.
DataSet dataSet = new DataSet();
dataSet.ReadXml(XmlReader.Create(new StringReader(xml.InnerXml)), XmlReadMode.InferSchema);
This moves the data from the XML into a DataSet, with one DataTable per repeating element.
Then I open a connection to the SQL database (ConnectionString contains your usual SQL connection string) and inject the data with a bulk insert:
SqlConnection conn = new SqlConnection(ConnectionString);
conn.Open();
foreach (DataTable dt in dataSet.Tables)
{
    string TableName = "[IMPORT_" + dt.TableName.Replace('{', ' ').Replace('}', ' ') + "]";
    // Clear out the previous import. Note that SqlBulkCopy needs the
    // destination table to exist, so if nothing recreates it before the
    // copy, use TRUNCATE TABLE here instead of DROP TABLE.
    ExecuteNonQuery(conn, String.Format("DROP TABLE {0}", TableName));
    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = TableName;
        bulk.WriteToServer(dt);
    }
}
static private bool ExecuteNonQuery(SqlConnection conn, string sql)
{
    try
    {
        using (SqlCommand cmd = conn.CreateCommand())
        {
            Debug.WriteLine(sql);
            cmd.CommandText = sql;
            cmd.ExecuteNonQuery();
        }
        Debug.WriteLine("ok");
        return true;
    }
    catch (Exception e)
    {
        Console.WriteLine(e.Message);
        return false;
    }
}
And this does the bulk insert work.
I wrote all of this from memory, so you may need to do a little work for a complete solution, but it gives you an idea of how I do it.
It's brutal, but it works very nicely. I have a dozen JSON sources; for each one I grab the JSON and inject it into an IMPORT_blabla table with a bulk insert, dropping the existing table first. You can do a TRUNCATE instead of a DROP if you prefer.
Then your data sits in the IMPORT_* tables, and a stored procedure does the INSERT/UPDATE work as needed.
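The stored procedure is the part that varies per source, but its INSERT/UPDATE step usually boils down to something like the sketch below. The table and column names are made up for illustration, and TRY_CONVERT requires SQL Server 2012 or later; on older versions you would validate with ISNUMERIC/ISDATE or similar before converting:

```sql
-- Sketch of the upsert procedure: convert the string columns coming from
-- the import table, then MERGE them into the real table.
CREATE PROCEDURE stored_procedure_blah
AS
BEGIN
    MERGE INTO BLAH AS target
    USING
    (
        SELECT TRY_CONVERT(INT, [Id])               AS Id,
               [Name],
               TRY_CONVERT(DECIMAL(18, 2), [Price]) AS Price,
               TRY_CONVERT(DATETIME, [UpdatedAt])   AS UpdatedAt
        FROM [IMPORT_BLAH]
        WHERE TRY_CONVERT(INT, [Id]) IS NOT NULL    -- skip rows that fail conversion
    ) AS source
    ON target.Id = source.Id
    WHEN MATCHED THEN
        UPDATE SET Name = source.Name,
                   Price = source.Price,
                   UpdatedAt = source.UpdatedAt
    WHEN NOT MATCHED THEN
        INSERT (Id, Name, Price, UpdatedAt)
        VALUES (source.Id, source.Name, source.Price, source.UpdatedAt);
END
```

A plain UPDATE followed by an INSERT ... WHERE NOT EXISTS works just as well if you prefer to avoid MERGE.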