DataPower has first-class support for JSON message processing through two frameworks: JSONx and JSONiq. Each has its pros and cons.
With JSONx, the message is converted into XML. This shields subsequent processing steps from the input message format and allows the continued use of existing XML tools and experience; think of JSONx as a mask that the JSON is wearing. Data can be easily converted back and forth between the two formats as needed.
With JSONiq, an extension of XQuery, a different syntax for processing the data is introduced, but with the benefit of not requiring an intermediate format. This can be beneficial if your company is already a strong supporter of XQuery.
For most shops, JSONx is the better choice, as you convert the “unknown” (JSON) into the “known” (XML).
JSON is a data-interchange format that is displacing SOAP/XML as the standard for services. DataPower provides two distinct methods of working with JSON: JSONx and JSONiq, an extension of XQuery. With JSONx, the data is converted into XML. With JSONiq, the data can be queried natively. This series will discuss the history of the XML appliance and show examples of how to work with JSON in the IBM DataPower Gateway environment.
For more information, you can read the specification at JSON.org.
SOAP/XML & JSON
When DataPower was initially released, SOAP (XML) was the primary data format for integration between endpoints in the enterprise. It used WSDL contracts and XSD data definitions to construct rigid service interface contracts, binding the service provider and consumer tightly to enable successful data exchange. The service provider had to ensure that the contract accurately described the request/response data, and the consuming developer had to understand how to properly construct a request message. This was not a straightforward process, and it led to the introduction of code-generator frameworks intended to abstract the technical details of data exchange. Even so, development efforts remained lengthy and costly.
On top of that, Web Services specifications (via the WS-* standards) started to grow larger and more complex to accommodate advanced usage scenarios. As these standards were growing, actual service design was becoming simpler. Most service implementations just wanted to expose a Remote Procedure Call (RPC) to a consumer. When the goal is a transfer of simple data with a data definition that can rapidly evolve over time, the effort of creating rigid WSDL contracts is overkill and provides little value.
This push toward JSON drove the DataPower environment to evolve as well; it supports working with JSON through two independent models:
- JSONx – Part 1 of this series
- JSONiq – Part 2 of this series (coming soon)
In JSONx, the idea is to convert the JSON data into an XML format. This XML format can be transformed at will and then converted back into JSON as needed.
In JSONiq, the idea is to natively query the JSON without the need for an intermediate format. The tradeoff is that it introduces a new syntax for transforming data if you don’t currently use the XQuery standard.
JSONx – JSON represented as XML
Given that DataPower is great at working with XML, wouldn’t it be nice if we could work with JSON in the same way? This way we could use all our current tools and deep knowledge of XML to deal with this new data format. Introducing JSONx, a standard that represents JSON content as XML.
JSONx provides a few XML elements that can be combined together to represent any JSON data:
- json:object – This represents a JSON object. It contains the elements representing the object’s members.
- json:array – This represents a JSON array. It contains the elements representing the array’s entries.
- json:string – This represents a JSON string. The value is enclosed by the element.
- json:number – This represents a JSON number. The value is enclosed by the element.
- json:boolean – This represents a JSON boolean. The value is enclosed by the element.
- json:null – This represents a null JSON value. The element is empty.
Each JSONx element that represents a member of an object also carries a name attribute. JSONx elements are defined in the XML namespace “http://www.ibm.com/xmlns/prod/2009/jsonx”.
```xml
<?xml version="1.0" encoding="UTF-8"?>
<json:object xmlns:json="http://www.ibm.com/xmlns/prod/2009/jsonx">
  <json:string name="name">John Smith</json:string>
  <json:string name="streetAddress">21 2nd Street</json:string>
  <json:string name="city">New York</json:string>
  <json:null name="additionalInfo" />
  <json:string name="ficoScore"> > 640</json:string>
</json:object>
```

This JSONx corresponds to the JSON:

```json
{
  "name": "John Smith",
  "streetAddress": "21 2nd Street",
  "city": "New York",
  "additionalInfo": null,
  "ficoScore": " > 640"
}
```
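DataPower performs the conversion back to JSON for you, but to show that the round trip really is ordinary XML processing, here is a minimal, illustrative XSLT 1.0 sketch that serializes JSONx back to JSON text. This is not DataPower’s implementation, and JSON string escaping is omitted for brevity:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative JSONx-to-JSON serializer; assumes well-formed JSONx input. -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:json="http://www.ibm.com/xmlns/prod/2009/jsonx">
  <xsl:output method="text"/>

  <!-- Emit "name": before any element that is a member of an object -->
  <xsl:template name="member-name">
    <xsl:if test="@name">
      <xsl:text>"</xsl:text><xsl:value-of select="@name"/><xsl:text>": </xsl:text>
    </xsl:if>
  </xsl:template>

  <xsl:template match="json:object">
    <xsl:call-template name="member-name"/>
    <xsl:text>{</xsl:text>
    <xsl:for-each select="*">
      <xsl:apply-templates select="."/>
      <xsl:if test="position() != last()"><xsl:text>, </xsl:text></xsl:if>
    </xsl:for-each>
    <xsl:text>}</xsl:text>
  </xsl:template>

  <xsl:template match="json:array">
    <xsl:call-template name="member-name"/>
    <xsl:text>[</xsl:text>
    <xsl:for-each select="*">
      <xsl:apply-templates select="."/>
      <xsl:if test="position() != last()"><xsl:text>, </xsl:text></xsl:if>
    </xsl:for-each>
    <xsl:text>]</xsl:text>
  </xsl:template>

  <xsl:template match="json:string">
    <xsl:call-template name="member-name"/>
    <xsl:text>"</xsl:text><xsl:value-of select="."/><xsl:text>"</xsl:text>
  </xsl:template>

  <!-- Numbers and booleans are emitted unquoted -->
  <xsl:template match="json:number | json:boolean">
    <xsl:call-template name="member-name"/>
    <xsl:value-of select="."/>
  </xsl:template>

  <xsl:template match="json:null">
    <xsl:call-template name="member-name"/>
    <xsl:text>null</xsl:text>
  </xsl:template>
</xsl:stylesheet>
```

Applied to the address example above, this walks the json:* elements and emits the equivalent JSON text using nothing beyond standard XSLT 1.0.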
A popular reaction is to groan audibly at the concept of representing JSON in XML, as if it were an attack on the JSON standard or needless ‘enterprising it up’. This reaction overlooks the realities of large enterprises and misses the purpose of the standard.
A large cost in IT is efficiently integrating anything new into the existing process. The company has already made large financial and personnel investments in supporting the current technology. If new data formats can quickly leverage existing tools and experience, then there is an associated reduction in the cost of adopting the technology. That’s the goal of JSONx: to abstract the details of JSON and convert it into a known commodity, XML. If you have a DataPower appliance, chances are your team is already deeply skilled in XML, XPath, and XSLT.
XMLFW / MPG Input Type: JSON
On XML Firewalls and Multi Protocol Gateways, the Request and Response type can be set to JSON. This will configure the endpoint to expect JSON as the input data format. The data will be validated for structural JSON correctness and then saved in the INPUT context for request rule processing.
Now, if you attempt to use this INPUT context with a transform action and an XSLT, you will get a parsing error. The INPUT context is JSON, and the transform action expects XML. To work with the INPUT context as XML, it should be converted into JSONx. You could add an action that converts the data yourself, but a special context is created for free when your request/response rule type is JSON: the ‘__JSONASJSONX’ context.
‘__JSONASJSONX’ will be populated with the JSONx representation of the INPUT. Simply switch the input context of your transform action to it and the transform will work.
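As a sketch, a transform reading the ‘__JSONASJSONX’ context is ordinary XSLT over the json:* elements. The field name below assumes the address example from earlier; adapt it to your own payload:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative transform over JSONx input; select it with plain XPath. -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:json="http://www.ibm.com/xmlns/prod/2009/jsonx">
  <xsl:template match="/">
    <!-- A JSON field is just an element with a name attribute -->
    <city>
      <xsl:value-of select="/json:object/json:string[@name='city']"/>
    </city>
  </xsl:template>
</xsl:stylesheet>
```

Configure this stylesheet in a transform action whose input context is ‘__JSONASJSONX’ rather than INPUT.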
Converting JSON to JSONx
__JSONASJSONX is only created from the INPUT context when the proxy is configured to accept JSON. If, later in a processing policy, ‘dp:url-open(..)’ is used to connect to a different JSON service, a JSON response message will be received, but how can it be converted into JSONx?
This can be accomplished with a ‘Convert Query Params to XML’ action. Within this action, create a new ‘Input Conversion’ and select JSON as the default encoding. The output context of the action will be the corresponding JSONx.
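For example, if the downstream service returned the small JSON document below (hypothetical data), the action’s output context would hold the corresponding JSONx:

```json
{"status": "ok", "count": 2}
```

becomes

```xml
<json:object xmlns:json="http://www.ibm.com/xmlns/prod/2009/jsonx">
  <json:string name="status">ok</json:string>
  <json:number name="count">2</json:number>
</json:object>
```

From there, the converted context can feed any transform action, exactly like ‘__JSONASJSONX’.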
Part one talked about JSONx and its value in converting a new data format, JSON, into a known data format, XML. JSONx is the correct choice for 95% of DataPower environments that want to begin working with JSON. If you are among the 5% that don’t want to use an intermediate format, then read part 2, which will discuss JSONiq.