
Friday the 13th - JSON is Coming to SQL Server

Overview of built-in support for JSON text processing in the new SQL Server.
In this article, I will show you the capabilities of SQL Server for processing JSON text.


Introduction

This fall, J(a)SON is coming to the new SQL Server!

If you expected a new movie in the Friday the 13th series, you might be disappointed - Jason Voorhees is not coming to SQL Server. However, JavaScript Object Notation (JSON) support is coming to the new SQL Server. Text formatted as JSON is shown in the following example:

JavaScript
{
    "info":{  
      "type":1,
      "address":{ "town":"Bristol","country":"England" },
      "tags":["Sport", "Water polo"]
   },
   "type":"Basic"
}

JSON is a simple, human-readable format that can contain complex data structures such as key:value pairs, objects, arrays, etc. The new SQL Server enables you to process JSON text.

You probably know that there is an ongoing "war" between relational and NoSQL concepts. Both approaches have their good and bad sides.

Relational databases are traditional data storage mechanisms. An intuitive SQL language, rich query capabilities, transactions, powerful indexes, and optimization for updates are the most important advantages of the relational model. The downside is that you might end up with a complex database schema and a lot of JOINs to retrieve your data.

NoSQL offers different concepts - complex structures are placed together into collections of entities, where you can take everything you need with one read operation, or insert a complex structure with a single write. The downside is that sometimes you want to organize your information in different collections, and then you will find that it is very hard to JOIN entities from two collections.

With the new SQL Server, you can choose between these two concepts and use the best of both worlds. In your data models, you can decide when to use traditional relational structures and when to introduce NoSQL concepts.

  • If you have parent/child relationships where the related child information is not changed frequently and you need to read the child records together with the parent without additional JOINs, you can store the child records in the parent table as a JSON array (a minimal sketch follows this list).
  • If you have information that is frequently updated, you should store it in regular tables. Regular tables are the best choice for frequently changed data.
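
As a minimal sketch of the first case (the Orders table below is a hypothetical example, not part of the article's schema), rarely changing line items can live as a JSON array right in the parent row:

SQL
CREATE TABLE Orders(
    OrderID int IDENTITY PRIMARY KEY,
    CustomerName nvarchar(100) NOT NULL,
    -- child records kept as a JSON array instead of a separate OrderItems table
    OrderItems nvarchar(max) NULL
        CONSTRAINT [OrderItems must be formatted as JSON] CHECK (ISJSON(OrderItems)>0)
)

INSERT INTO Orders(CustomerName, OrderItems)
VALUES ('John Doe', N'[{"Product":"Bike","Qty":1},{"Product":"Helmet","Qty":2}]')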

SQL Server 2016 and Azure SQL Database will have built-in support for processing text formatted as JSON. The first functionality - formatting the results of a SELECT query as JSON text - is already available in SQL Server 2016 CTP2, and SQL Server 2016 CTP3 adds functions for parsing JSON text. Some additional enhancements will come by RTM.

In this article, we will see what will be included in this version.

Background

JSON support in SQL Server is one of the most highly ranked feature requests on Microsoft Connect, with more than 1000 votes. In SQL Server 2016, built-in support for JSON text processing will be added. A high-level overview of the JSON features in SQL Server 2016 is shown in the following figure:

[Figure 1: High-level overview of JSON features in SQL Server 2016]

SQL Server will provide a set of built-in functions that enable you to validate that JSON text is properly formatted, find a value in JSON text on some JavaScript-like path, find a fragment of JSON text, etc.

These are basic functions that enable you to parse JSON text without CLR code or complex regular expressions.

If you need more advanced querying, you can transform JSON text into a standard table using the OPENJSON table-valued function. This function iterates through the array of objects in the JSON text and creates one table row for each object. Key:value pairs from JSON are converted into header:cell pairs. Once you convert JSON text to a table, you can use any standard SQL Server function and operator (GROUP BY, aggregates, etc.).
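
As a small sketch of that idea (the @orders variable and its content are made up for illustration), rows produced by OPENJSON can be aggregated just like rows from a regular table:

SQL
DECLARE @orders NVARCHAR(MAX) =
N'[{"Product":"Bike","Qty":1,"Price":500},
   {"Product":"Helmet","Qty":2,"Price":50},
   {"Product":"Bike","Qty":3,"Price":480}]'

-- each JSON object becomes one row, which can then be grouped and summed
SELECT Product, SUM(Qty * Price) AS Total
FROM OPENJSON(@orders)
     WITH (Product nvarchar(50), Qty int, Price money)
GROUP BY Product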

Finally, if you have table data that you want to transform to JSON text, there is a new FOR JSON clause in the SELECT statement. The FOR JSON clause formats the result set of a query as an array of JSON objects. This option takes header:cell values and converts them into key:value pairs in JSON text.

Getting Started

In this section, we will see a short overview of the existing functionalities. JSON is text, so you can store it in standard text variables:

SQL
DECLARE @json NVARCHAR(4000)
SET @json = 
N'{
    "info":{  
      "type":1,
      "address":{  
        "town":"Bristol",
        "county":"Avon",
        "country":"England"
      },
      "tags":["Sport", "Water polo"]
   },
   "type":"Basic"
}'

I will use this variable in the following examples.

The JSON_VALUE function enables you to get a value on some JavaScript-like path in JSON text:

SQL
SELECT
  JSON_VALUE(@json, '$.type') a,
  JSON_VALUE(@json, '$.info.type') b,
  JSON_VALUE(@json, '$.info.address.town') c,
  JSON_VALUE(@json, '$.info.tags[0]') d
a      b  c        d
Basic  1  Bristol  Sport

The JSON_QUERY function enables you to get an object or array from JSON text using some JavaScript-like path:

SQL
SELECT
  JSON_QUERY(@json, '$') as [object],
  JSON_QUERY(@json, '$.info') as info,
  JSON_QUERY(@json, '$.info.address') as address,
  JSON_QUERY(@json, '$.info.tags') as tags

This query will return the following results:

object:  { "info":{ "type":1, "address":{ "town":"Bristol", "county":"Avon", "country":"England" }, "tags":["Sport", "Water polo"] }, "type":"Basic" }
info:    { "type":1, "address":{ "town":"Bristol", "county":"Avon", "country":"England" }, "tags":["Sport", "Water polo"] }
address: { "town":"Bristol", "county":"Avon", "country":"England" }
tags:    ["Sport", "Water polo"]

The ISJSON function returns 1 if the JSON text is properly formatted:

SQL
SELECT
  ISJSON(@json),
  ISJSON(JSON_QUERY(@json, '$.info')),
  ISJSON(JSON_VALUE(@json, '$.info.type'))

The first two columns return 1, while the third returns 0 because the value of the $.info.type property is a number and not valid JSON.

OPENJSON returns a set of rows using the elements of a JSON array or the key:value pairs of a JSON object:

SQL
SELECT [key] as property, value
FROM OPENJSON(@json, '$.info.address')

SELECT [key] as i, value
FROM OPENJSON(@json, '$.info.tags')

There is another form of the OPENJSON function that takes elements from objects and casts them to the specified types:

SQL
SELECT *
FROM OPENJSON(@json, '$.info')
WITH ( type int,
       town nvarchar(50) '$.address.town',
       country nvarchar(50) '$.address.country'
     )

Finally, FOR JSON enables you to format the results of a SELECT query as JSON text:

SQL
SELECT FirstName, LastName
FROM Person
FOR JSON PATH

In the following sections, we will see how you can use these functions in SQL Server.

Storing JSON in SQL Server Database

As I mentioned above, SQL Server adds the ability to process JSON text. Therefore, you will store it as regular text (NVARCHAR columns) in SQL Server. If you need a simple collection of JSON values like in Azure DocumentDB or MongoDB, you can create a simple two-column table:

SQL
CREATE TABLE Person
( _id int identity constraint PK_JSON_ID primary key,
  value nvarchar(max) 
       CONSTRAINT [Content should be formatted as JSON]
                 CHECK (ISJSON(value)>0)
)

You can create a standard CHECK constraint with the ISJSON function to ensure that the text stored in the JSON column is valid. If you are sure that your text will always be formatted as JSON, you can turn off the check constraint, as shown below.
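
For example, the constraint can be disabled and later re-enabled with standard ALTER TABLE syntax (the constraint name matches the table definition above):

SQL
-- skip JSON validation, e.g. during a trusted bulk load
ALTER TABLE Person NOCHECK CONSTRAINT [Content should be formatted as JSON]

-- turn validation back on afterwards
ALTER TABLE Person CHECK CONSTRAINT [Content should be formatted as JSON]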

However, SQL Server is more of a hybrid engine where you can combine relational and JSON data. The following example shows a simple Person table with standard columns and variable data stored as JSON text.

SQL
CREATE TABLE Person(
    PersonID int IDENTITY PRIMARY KEY,
    FirstName dbo.Name NOT NULL,
    LastName dbo.Name NOT NULL,
    AdditionalInfo nvarchar(max) NULL,
    PhoneNumbers nvarchar(max) NULL,
    EmailAddresses nvarchar(max) NULL
        CONSTRAINT [Email addresses must be formatted as JSON array]
            CHECK  (isjson(EmailAddresses)>0)
)

If you compare this structure with the Person table in the AdventureWorks database, you will notice that you don't need additional tables (PersonPhoneNumbers, PersonEmailAddresses, etc.), foreign keys, and indexes for each array of strings or objects. Here, I will store the phone numbers and email addresses associated with a person as JSON arrays. Any additional variable information can be placed in the AdditionalInfo column.

Now the important question - why would you store addresses and phones as JSON instead of separate tables? In some cases, you might improve the performance of your queries. Imagine a standard SQL query that reads person, person_phone, and person_email information. You would need to scan three tables and use two JOINs to collect the data. With the information de-normalized into a single table, you have a single table scan.

Another use case is import. Importing persons, phones, and emails into a single table is faster than importing data into the person table, getting the generated identity value for the id, using that identity to import phones and emails, and repeating this process for each person.

The drawback of this approach is query performance: referencing columns is faster than referencing properties in JSON text, so use this approach only for related child data that you want to pull into the parent rows and read together with them.

In the following sections, we will see how you can query JSON text stored in the Person table.

Querying

The fact that information is formatted as JSON does not mean that it is lost. SQL Server is not blind to JSON data. The currently available functions are:

  • ISJSON - verifies that text is formatted as JSON
  • JSON_VALUE - returns a scalar value from JSON text on some JavaScript path
  • JSON_QUERY - returns a fragment (e.g., JSON array or sub-object) from JSON text

You can use JSON_VALUE and JSON_QUERY functions to take information from JSON text and use it in the queries:

SQL
SELECT PersonID, FirstName, LastName,
    JSON_VALUE(AdditionalInfo, '$.Title') Title,
    JSON_VALUE(AdditionalInfo, '$.NameStyle') NameStyle,
    JSON_VALUE(AdditionalInfo, '$.PaymentInfo.Salary') Salary,
    JSON_QUERY(AdditionalInfo, '$.Skills') Skills,
    EmailAddresses
FROM Person
WHERE JSON_VALUE(AdditionalInfo, '$.Company') = @company
AND ISJSON(AdditionalInfo) > 0
ORDER BY JSON_VALUE(AdditionalInfo, '$.PaymentInfo.Salary')

You can use properties from JSON text together with regular columns in any part of the query, such as the WHERE clause, ORDER BY, GROUP BY, HAVING, etc. Note that all JSON built-in functions accept a JavaScript-like path that references some property within the JSON text. A JSON path starts with $, which represents the first argument of the function, followed by regular JavaScript syntax for referencing JSON fields. If your keys contain non-alphanumeric characters, you should put them in double quotes, e.g. $."Sales Person"."First Name".
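
For instance, if the JSON text contained a key with a space in its name (the "Sales Person" object below is a made-up example), the quoted path syntax would look like this:

SQL
DECLARE @j NVARCHAR(200) =
N'{"Sales Person":{"First Name":"John","Last Name":"Doe"}}'

SELECT JSON_VALUE(@j, '$."Sales Person"."First Name"') AS FirstName
-- returns: John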

OPENJSON

JSON text is parsed using the new OPENJSON function. In its simplest form, OPENJSON returns all key:value pairs of a JSON object:

SQL
SET @json =
N'{ "Title":"Mr",
    "PaymentInfo":{"Salary":1500,"Type":"Weekly"},
    "Company":"AdventureWorks",
    "Skills":["SQL",".Net","C#"]
}'

SELECT * 
FROM OPENJSON(@json)

This function will return all keys (Title, PaymentInfo, Company, Skills) and their values:

key          value
Title        Mr
PaymentInfo  {"Salary":1500,"Type":"Weekly"}
Company      AdventureWorks
Skills       ["SQL",".Net","C#"]

The OPENJSON function can parse any JSON object and return a strongly typed row that can be used in any query, e.g.:

SQL
SET @json =
N'{"Title":"Mr", "PaymentInfo":{"Salary":1500,
"Type":"Weekly"},"Company":"AdventureWorks",
"Skills":["SQL",".Net","C#"]
}'

SELECT Title, Salary, Company 
FROM OPENJSON(@json)
     WITH (Title nvarchar(20), Salary int '$.PaymentInfo.Salary', Company  nvarchar(20))

The OPENJSON function will parse the JSON object, match JSON properties using the "column names" defined in the WITH clause, and return an equivalent row with strongly typed columns. If a column name does not match the property path in JSON, you can add a reference path after the type (see the '$.PaymentInfo.Salary' example).

Title  Salary  Company
Mr     1500    AdventureWorks

This is useful if you have complex nested objects.

Besides JSON objects, OPENJSON can parse JSON arrays. As an example, we can return the values from the Skills array in the JSON object:

SQL
SET @json = 
N'{"Title":"Mr","PaymentInfo":{"Salary":1500,
"Type":"Weekly"},"Company":"AdventureWorks",
   "Skills":["SQL",".Net","C#"]
}'
SELECT value
FROM OPENJSON(@json, '$.Skills')

The OPENJSON function will seek into the Skills property and return the values in the Skills array (one row per element):

Elements in the $.Skills array in the object:

key  value
0    SQL
1    .Net
2    C#

In this example, I have a simple array of string values, but OPENJSON works with complex arrays of objects as well.

APPLY and OPENJSON

Whenever you have an array of JSON objects in some column, you might need to extract this array and join it to the parent row. As an example, we might look at the following structure:

ID  Name  PhoneNumbers
1   John  [{"Number":"06472643","Type":"Work"}, {"Number":"01164322","Type":"Home"}]
2   Jane  [{"Number":"01726443","Type":"Work"}, {"Number":"06243344","Type":"Mobile"}]
3   Jack  [{"Number":"01167343","Type":"Home"}]

If you think about the phone number array as an internal virtual table of phone numbers, you might want to JOIN the Person row with a set of phone numbers. You can JOIN an array of JSON objects with the parent row using the CROSS APPLY operator:

SQL
SELECT ID,Name,Number,Type
FROM Person
  CROSS APPLY
  OPENJSON(PhoneNumbers)
  WITH (Number nvarchar(100), Type nvarchar(20))

OPENJSON will transform the array of phone numbers and extract the Number and Type properties as table columns. Then, CROSS APPLY will join this table to the parent Person row:

[Figure 2: Phone numbers returned by OPENJSON joined to their parent Person rows]

CROSS APPLY and OPENJSON will be a very common combination in queries that read both relational columns and JSON text. Note that there are two operators that you might use - CROSS APPLY and OUTER APPLY. The key differences are:

  • CROSS APPLY will suppress the parent row if OPENJSON does not return any rows from the JSON array.
  • OUTER APPLY will return the parent row even if OPENJSON does not return anything.
  • If OPENJSON returns at least one row, both CROSS and OUTER APPLY work the same way (a sketch comparing the two follows this list).
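
A minimal sketch of the difference, assuming some Person rows have NULL (or an empty array) in the PhoneNumbers column:

SQL
-- CROSS APPLY: persons without any parsed phone rows disappear from the result
SELECT ID, Name, Number, Type
FROM Person
  CROSS APPLY OPENJSON(PhoneNumbers)
              WITH (Number nvarchar(100), Type nvarchar(20))

-- OUTER APPLY: such persons are kept, with NULL in Number and Type
SELECT ID, Name, Number, Type
FROM Person
  OUTER APPLY OPENJSON(PhoneNumbers)
              WITH (Number nvarchar(100), Type nvarchar(20))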

You can use OPENJSON to expand an array of values in some column and search rows by some element, e.g.:

SQL
SELECT PersonID,FirstName,LastName,EmailAddresses, PhoneNumbers
FROM Person
    CROSS APPLY OPENJSON(EmailAddresses) EA
WHERE EA.value = 'john@mail.com'

The OPENJSON function will take the text in the EmailAddresses column, iterate through the elements of the array, and return the value of each element. The query will filter only the rows where EA.value equals 'john@mail.com'.

The CROSS APPLY/OPENJSON pair is able to process very complex JSON structures. As an example, we can parse complex structures like the GeoJSON representation of a multi-polygon:

[Figure 3: Multi-polygon geometry represented in GeoJSON]
JavaScript
{ "type": "MultiPolygon", 
    "coordinates": [
        [
            [[40, 40], [20, 45], [45, 30], [40, 40]]
        ], 
        [
            [[20, 35], [10, 30], [10, 10], [30, 5], [45, 20], [20, 35]], 
            [[30, 20], [20, 15], [20, 25], [30, 20]]
        ]
    ]
}

A multi-polygon represented in GeoJSON format has an array of polygons, where each polygon has an array of lines, each line has an array of coordinates, and each coordinate is represented as a two-element array. In a relational structure, a multi-polygon would be represented as a set of four tables. However, in JSON format, you have a nested structure of arrays.

You can query this nested structure using the following query:

SQL
SELECT polygons.[key] as polygon, lines.[key] as line, x, y
FROM OPENJSON(@multipolygon, '$.coordinates') as polygons
       CROSS APPLY OPENJSON(polygons.value) as lines
              CROSS APPLY OPENJSON(lines.value)
                     WITH (x float '$[0]', y float '$[1]')

Each CROSS APPLY OPENJSON is equivalent to a JOIN to a table in the relational model. The first OPENJSON call parses the list of polygons at the first level. The polygons.value coming from this OPENJSON contains the array of lines, so CROSS APPLY OPENJSON(polygons.value) returns the array of lines in that polygon. CROSS APPLY OPENJSON(lines.value) parses the coordinates in the line array and returns the x and y coordinates at indexes 0 and 1.

As you can see, with a couple of nested CROSS APPLY operators, you can parse any JSON structure.

Computed Columns

If you don't want to use JSON functions directly in queries, you can encapsulate them using computed columns and views.

As an example, if you frequently use the Title and PaymentInfo.Salary fields from the JSON column, you can add computed columns that reference these fields:

SQL
ALTER TABLE Person
ADD vTitle   AS JSON_VALUE(AdditionalInfo, '$.Title'),
    vSalary  AS JSON_VALUE(AdditionalInfo, '$.PaymentInfo.Salary')

Now you can use the computed columns instead of direct calls to the JSON_VALUE function.
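
For example, a query can reference vTitle and vSalary like ordinary columns (the filter value below is just illustrative):

SQL
SELECT PersonID, FirstName, LastName, vTitle, vSalary
FROM Person
WHERE vTitle = 'Mr'
ORDER BY vSalary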

If you want a relational view over JSON text, this is also possible. You just need to apply the OPENJSON function to the JSON text, specify the paths in JSON, and define the return types of the columns:

SQL
CREATE VIEW PersonInfo AS
SELECT PersonID, MiddleName, Salutation, Salary
FROM Person CROSS APPLY
    OPENJSON(AdditionalInfo)
    WITH(
        MiddleName nvarchar(40) '$.Info.MiddleName',
        Salutation nvarchar(40) '$.Info.Salutation',
        Salary float '$.PaymentInfo.Salary'
    )

OPENJSON will open the JSON text in the AdditionalInfo column and return the values on the Info.MiddleName, Info.Salutation, and PaymentInfo.Salary paths. It will cast the values in the JSON text to the specified types, and you will see them as a virtual child table. If AdditionalInfo contains a single JSON object, OPENJSON will return a single row. Otherwise, it will return a table with one row for each element in the JSON array.
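
Once the view exists, queries over it read like queries over an ordinary relational table (the salary threshold below is just an illustrative value):

SQL
SELECT PersonID, MiddleName, Salutation, Salary
FROM PersonInfo
WHERE Salary > 1000
ORDER BY Salary DESC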

Collation Aware JSON Functions

JSON functions return text that inherits the collation of the source column. The following query will always work:

SQL
SELECT PersonID,FirstName,LastName,EmailAddresses, PhoneNumbers 
FROM Person
WHERE JSON_VALUE(AdditionalInfo, '$.isActive') = 'true'

However, if the original text column that contains the JSON text is defined as case insensitive, the following query is also valid:

SQL
SELECT PersonID,FirstName,LastName,EmailAddresses, PhoneNumbers 
FROM Person
WHERE JSON_VALUE(AdditionalInfo, '$.isActive') = 'TRUE'

The equality predicate will ignore case sensitivity if the JSON column AdditionalInfo is case insensitive.
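
If you want this behavior, one way (shown here as a sketch; Latin1_General_CI_AS is just an example of a case-insensitive collation) is to declare the collation explicitly on the JSON column:

SQL
-- CI in the collation name marks it as case insensitive;
-- JSON_VALUE results read from this column inherit that collation
ALTER TABLE Person
ALTER COLUMN AdditionalInfo nvarchar(max) COLLATE Latin1_General_CI_AS NULL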

Indexing JSON Data

SQL Server enables you to create standard indexes both on regular table columns and properties in JSON text.

As an example, if you want to index the $.Company property in the AdditionalInfo column, you can create the following index:

SQL
ALTER TABLE Person
ADD vCompany AS JSON_VALUE(AdditionalInfo, '$.Company'),
    vSalary  AS JSON_VALUE(AdditionalInfo, '$.PaymentInfo.Salary'),
    vTitle   AS JSON_VALUE(AdditionalInfo, '$.Title'),
    vSkills  AS JSON_QUERY(AdditionalInfo, '$.Skills')

CREATE INDEX idx_Person_1
    ON Person(vCompany)
    INCLUDE(FirstName, LastName, vTitle, vSkills, vSalary)

First, you need to create virtual computed columns that reference properties in the JSON column. Note that these columns take no additional space - they are computed only when they are used.

Then, you can create a regular index on these columns. Note that SQL Server gives you fine granularity when designing indexes. You can choose which fields will be used to filter or sort records, and which fields will be used just as included columns in the index (optimized for additional columns in the SELECT list).

If you run the initial query, you will see that instead of a full table scan, it uses the new index. You don't need to rewrite the query - SQL Server will know that the JSON_VALUE functions in the query should be mapped to the virtual columns that are indexed. See more details about indexing JSON documents in the MSDN blog post.
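
For example, a query with the same predicate shape as the earlier @company query should seek on idx_Person_1 rather than scan the table, assuming the optimizer matches the JSON_VALUE expression to the vCompany computed column as described above:

SQL
SELECT FirstName, LastName, vTitle, vSkills, vSalary
FROM Person
WHERE JSON_VALUE(AdditionalInfo, '$.Company') = 'AdventureWorks'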

The fact that JSON is represented as text enables you to create a standard full-text search index on JSON. This is a good choice if you have an array of JSON numbers or strings, like the array of email addresses in the example above. You can create a full-text search index on the EmailAddresses column using the following code:

SQL
CREATE FULLTEXT INDEX ON Person(EmailAddresses)
    KEY INDEX PK_Person_ID
    ON jsonFullTextCatalog;

The FTS index will split the text in the JSON array and index each text element. The following query can be used to find all Person rows that contain the email address john@mail.com:

SQL
SELECT PersonID,FirstName,LastName,EmailAddresses, PhoneNumbers 
FROM Person
WHERE CONTAINS(EmailAddresses, 'john@mail.com')

Full-text search enables you to create complex queries with AND, OR and NOT operators, e.g.:

SQL
SELECT PersonID,FirstName,LastName,EmailAddresses, PhoneNumbers
FROM Person
WHERE CONTAINS(EmailAddresses, 'john@mail.com OR jovan@mail.com')

As you can see, SQL Server provides a simple but powerful mechanism for querying and indexing JSON text. You can find more details in Indexing JSON arrays using full-text search indexes article.

Formatting Relational Data as JSON

If you want to return information from the tables formatted as JSON, you can simply add FOR JSON PATH clause at the end of the query:

SQL
SELECT PersonID,FirstName,LastName,
       JSON_QUERY(EmailAddresses) AS 'Contact.Emails',
       JSON_QUERY(PhoneNumbers) AS 'Contact.Phones'
FROM Person
WHERE CONTAINS(EmailAddresses, 'john@mail.com')
FOR JSON PATH

The FOR JSON PATH clause will concatenate all records from the returned set of rows and create key:value pairs using the column names/aliases and the values in the rows. The result might look like the following example:

JavaScript
[
 { "FirstName":"John",
   "LastName":"Doe",
   "Contact": {
        "Emails": ["john@mail.com","john@aw.com"],
        "Phones": ["+381 55 555 555"]
   }
 },
 { "FirstName":"Jane",
   "LastName":"Doe",
   "Contact": {
        "Emails": ["jane@mail.com","jane@aw.com"],
        "Phones": []
   }
 },
 ...
]

Since both emails and phones have Contact prefix, FOR JSON PATH will group them together in one object.

Note one important detail - I have wrapped EmailAddresses and PhoneNumbers with JSON_QUERY(). If I returned these columns as plain text, FOR JSON would treat them as regular text, wrap them in quotes, and escape characters in the text according to the JSON escaping rules. However, when I pass the values to the JSON_QUERY function, FOR JSON treats these text values as JSON and just includes them in the output. It is like casting JSON text to JSON.
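
A small sketch of the difference (the @emails variable is made up for illustration):

SQL
DECLARE @emails NVARCHAR(200) = N'["john@mail.com","john@aw.com"]'

SELECT @emails AS PlainEmails,            -- escaped and wrapped in quotes as a plain string
       JSON_QUERY(@emails) AS JsonEmails  -- embedded as-is as a JSON array
FOR JSON PATH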

The FOR JSON clause might be useful when you have a one-to-many structure and you need to join a parent row (e.g., Person) to a set of child rows (e.g., PersonPhones, PersonPhoneTypes). Imagine that you have a relational structure where person phone numbers are organized in PersonPhone and PhoneNumberType tables. A regular JOIN will increase the number of rows because it returns one row in the result set for each Person:Phone pair, so you will need to process the results on the client side. With FOR JSON, you can return all related phone numbers as a single cell in one row:

SQL
SELECT PersonID,FirstName,LastName,
        (SELECT PersonPhone.PhoneNumber, PhoneNumberType.Name AS PhoneNumberType
            FROM  Person.PersonPhone
            INNER JOIN PhoneNumberType
                ON PersonPhone.PhoneNumberTypeID = PhoneNumberType.PhoneNumberTypeID
         WHERE Person.PersonID = PersonPhone.PersonID
        FOR JSON PATH) AS PhoneNumbers
FROM Person

Another scenario where you will use FOR JSON is de-normalization. Instead of joining related tables, you might want to simplify the schema and, for each person, move the set of related phone numbers into an array of phone numbers in the Person.PhoneNumbers column. You can use something like the following query:

SQL
UPDATE Person
SET PhoneNumbers =
        (SELECT PersonPhone.PhoneNumber, PhoneNumberType.Name AS PhoneNumberType
            FROM  Person.PersonPhone
            INNER JOIN PhoneNumberType
                ON PersonPhone.PhoneNumberTypeID = PhoneNumberType.PhoneNumberTypeID
         WHERE Person.PersonID = PersonPhone.PersonID
        FOR JSON PATH) 

The inner query selects all phones and types related to the person that is currently being updated. The FOR JSON PATH clause formats the set of rows into a single JSON array that can be stored as text in the PhoneNumbers column.

Now your SELECT query will have only one table scan instead of three table scans with two JOIN operators, which might improve performance.

Importing JSON Data

Finally, we will see how you can import JSON text into tables. With the OPENJSON function, you can easily import text formatted as JSON into any table. OPENJSON will parse the JSON text and return all array elements; the only thing you need to do is insert the rows returned by OPENJSON into the table:

SQL
INSERT INTO Person(FirstName,LastName,EmailAddresses, PhoneNumbers)
SELECT FirstName,LastName,EmailAddresses, PhoneNumbers
FROM OPENJSON(@PersonJSON)
    WITH(
        FirstName nvarchar(40),
        LastName nvarchar(40),
        AdditionalInfo NVARCHAR(MAX) AS JSON,
        EmailAddresses NVARCHAR(MAX) AS JSON,
        PhoneNumbers NVARCHAR(MAX) AS JSON
    )

In the WITH clause, you specify which keys from the JSON objects should be fetched. OPENJSON will match the keys by column names and return the values. If a key in the JSON object references a nested JSON object or array (e.g., the EmailAddresses and PhoneNumbers keys reference inner arrays instead of simple strings), you need to add the AS JSON option.

In this example, the assumption is that the @PersonJSON variable/parameter contains JSON text with a flat structure of fields. However, OPENJSON can parse even complex structures with nested elements.

The following example shows SQL code that reads the content of a file from the file system using the OPENROWSET BULK function and passes the content of the file (BulkColumn) to the OPENJSON function:

SQL
INSERT INTO Person(FirstName,LastName,EmailAddresses, PhoneNumbers)
SELECT person.* FROM
OPENROWSET(BULK N't:\share\people.json', SINGLE_CLOB) AS json
            CROSS APPLY OPENJSON(BulkColumn)
                        WITH(
                            FirstName nvarchar(40),    LastName nvarchar(40),
                            AdditionalInfo NVARCHAR(MAX) AS JSON,
                            EmailAddresses NVARCHAR(MAX) AS JSON,
                            PhoneNumbers NVARCHAR(MAX) AS JSON ) AS person

OPENJSON function will parse JSON content in the file and insert it into Person table.

This code works with local files, but you can also load files from the cloud. Azure File Storage supports the SMB protocol, so you can map a share in Azure Files like a mapped disk and reference files in the cloud directly as if they were in the local file system. This might be the easiest way to import JSON files from the cloud into your database. You can find more examples in the post Inserting JSON Text into SQL Server Table.
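
As a sketch (the storage account and share names below are placeholders, and the SQL Server service account is assumed to have access to the share), the same OPENROWSET call can point at a file in Azure Files through its UNC path:

SQL
SELECT BulkColumn
FROM OPENROWSET(
        BULK N'\\mystorageaccount.file.core.windows.net\myshare\people.json',
        SINGLE_CLOB) AS json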

Utilities

Since JSON is a simple format, you can use it to pass a list of values to SQL Server as a parameter. Imagine that you want to return a set of Person rows from the database based on a list of primary key values. You can create a simple procedure as shown in the following code:

SQL
CREATE PROCEDURE dbo.GetPersonList(@PersonIds nvarchar(100))
AS BEGIN

    SELECT *
    FROM Person.Person
        JOIN OPENJSON(@PersonIds)
            ON BusinessEntityID = value
    -- Equivalent to "WHERE BusinessEntityID IN (1,4,6,9)"

END

OPENJSON will go through the JSON array and return one row for each element, and you can JOIN the result with the Person table. Now, if you want to search for Person rows by a list of ids, you just need to pass the comma-separated list formatted as a JSON array, e.g.:

SQL
EXEC dbo.GetPersonList '[1,4,6,9]'

If you are concerned about performance, you might look at the following article: OPENJSON - one of best ways to select rows by list of ids. Alternative approaches using XML and table-valued parameters can be found in SQL Wizardry Part Four - passing lists of data to SQL Server.

Points of Interest

In this article, I have shown the capabilities of SQL Server for processing JSON text. If you need combined data models where you store and process both relational and JSON data, this might be a helpful overview of the functionality offered in SQL Server. Most of these functionalities are available in SQL Server 2016 CTP3, and soon they will be available in Azure SQL Database.

History

  • 13th November, 2015: Initial version

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)