In the first part of this two-part blog post, I will describe the use case for which we used Oracle API Platform Cloud Service (shortened to APIPCS from now on) as an outbound API gateway, offering lookup services and centralizing outbound access. This part will also describe a simple Node.js microservice we deployed to Oracle's Application Container Cloud Service to translate an internal identifier into the API key required by the external API.
The second part of this blog will elaborate on the actual creation of the API definition in APIPCS, the invocation of the microservice using Groovy scripting and the testing of the external API using APIPCS.
Introduction
For a proof of concept at a customer last year, I worked with an exciting new cloud product from Oracle: API Platform Cloud Service, or APIPCS for short. In this blog, I will not go into architectural details on why you could, should or must use an API Manager or API Gateway. Nor will I elaborate on all the functionality and features of Oracle's implementation of the product, since they're perfectly capable of doing that themselves.
Basic Scenario
In the use case encountered at this specific customer, the requirement was not to use the API Gateway as an inbound gateway, but rather to use the API Management solution as an outbound gateway. The rationale for this setup is to provide a single outbound gateway through which REST services can be invoked, and where internal identifications can be translated into various kinds of external identifications known to the outside world. The former may be thought of as (internal) applications, or perhaps even customer identifications stored in an internal system; the latter category may relate to API keys required by the external partner, or even OAuth2 tokens.
The advantage of this outbound API gateway lies in the centralization of the translation of identifications into external credentials, and in the fact that all applications can use the same outbound gateway in the company's outbound DMZ.
Scenario using API Platform Cloud Service
The specific scenario we have implemented is the invocation of a key-secured, external API through an Oracle Cloud-based APIPCS instance from within the corporate domain.
So, the call is coming from within the corporate IT domain; this could be an application running on a server within the corporate domain, like a CRM application, but it might also be a web server proxying a request from an external user or application. Whatever the source, the API call itself is generated from within the corporate network domain and invokes an API endpoint exposed on the API Platform Cloud Service, which is running in the Oracle Cloud. For this POC, we're running the gateway inside the Oracle Cloud, but the product also allows you to run only the management service in the cloud and run the gateways on-premises, inside your DMZ.
Upon receiving a call, the API Gateway – which was co-located with the API Management Service for this proof of concept – running in the Oracle Cloud processes the request through all 'policies' and invokes the service implementation, using the payload and headers from the request pipeline. In this case, the service implementation is located somewhere on the GBI ('Grote Boze Internet', Dutch for 'Big Bad Internet'). The response from the service implementation is processed through all policies defined for the response path and returned to the consumer.
Use Case
For this proof of concept, the use case was providing Chamber of Commerce ('Kamer van Koophandel') information as offered by overheid.io. As can be seen from the documentation, invoking this API requires an active (free) subscription identified by an API key ('ovio-api-key'), passed as an HTTP header, a query parameter or a name-value pair in the body; which options are available depends on the HTTP method used.
The API Platform Cloud Service offering that was available to us at the time of the POC did not (yet?) offer the possibility of looking up API keys. The specific use case required us to show that we could map an internal identifier to the required API key, where multiple internal identifiers could map to the same API key.
As it turned out, APIPCS did not yet offer a way to map these internal identifiers, but it did offer a way to perform a service callout for data validation. It takes some cunning to squeeze out the required information, as this callout is intended for basic validation. For this use case, we therefore needed a microservice to perform the credential lookups.
Credential Lookup Microservice
The microservice to perform the translation has three different operations:
| HTTP Method | URL | Action |
| --- | --- | --- |
| GET | /apikey/{id} | Returns a JSON object containing the apikey if present (HTTP 200); if the requested id does not exist, an HTTP 404 is returned with an error. |
| DELETE | /apikey/{id} | If the apikey is present, the stored key is removed and a status message returned (HTTP 200); if the requested id does not exist, an HTTP 404 is returned with an error. |
| POST | /apikey | The apikey from the request body is stored, using the id in the request body as the key. |
The API payload contains the internal identifier (id) and the API key (apikey) that secures the external service.
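For illustration, using the sample values from the service's source code, the POST payload looks like:

```json
{ "id": "MeMyselfAndI", "apikey": "OpenSesame" }
```

and a successful GET for that id then returns:

```json
{ "apikey": "OpenSesame" }
```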
The microservice was implemented in Node.js using the Express framework and, this being a simple POC, uses file-system persistence through node-persist.
Node.js is a fast and scalable server-side JavaScript platform based on Google's V8 engine and is available as a runtime at major cloud providers, like Heroku (see my previous blog https://www.syntouch.nl/first-steps-cloud-based-js-development), Google's AppEngine, Amazon's Elastic Beanstalk and our platform of choice, Oracle Application Container Cloud.
Service Dependencies
For Node.js applications, the dependencies are defined in a file called package.json, located at the root of the project’s file structure for the Node Package Manager (npm) to process:
Package.json
{
  "name": "nodejs-apikey",
  "version": "1.0.0",
  "dependencies": {
    "body-parser": ">1.10.0",
    "node-persist": "~0.0.11",
    "express": ">4.0.0"
  }
}
Service Implementation
The service implementation for this utility service is very small, so I took the shortcut of implementing it in a single source file. It consists of a series of request handlers attached to specific routes; these routes correspond to the HTTP method and URL combinations described above.
Apikey.js
var express = require('express');
var bodyParser = require('body-parser');
var storage = require('node-persist');

var app = express();
var PORT = process.env.PORT || 80;

storage.initSync({ dir: 'storage' });

/* Accepting any type and assuming it is application/json; otherwise the caller
 * is forced to pass the content-type explicitly. */
function defaultContentTypeMiddleware(req, res, next) {
  req.headers['content-type'] = req.headers['content-type'] || 'application/json';
  next();
}

/* Handler for the storage of a new apikey for the specified id.
   Sample payload: { "id" : "MeMyselfAndI", "apikey" : "OpenSesame" } */
function handlepostkey(req, res) {
  var response = {};
  // Store the apikey under the supplied id
  if (req.body.apikey) {
    var newKey = {};
    newKey.apikey = req.body.apikey;
    storage.setItem(req.body.id, newKey);
    res.statusCode = 201;
    response.status = 'key successfully stored';
    res.json(response);
    console.log('Stored apikey for id: ' + req.body.id);
  } else {
    console.log('Cannot store key from request: ' + JSON.stringify(req.body));
    response.error = 'No key supplied';
    res.statusCode = 400; // bad request: no apikey in the payload
    res.json(response);
  }
}

/* Handler for the retrieval of the apikey for an id, as in GET /apikey/{id}.
   Response is { "apikey" : "APIKEYVALUE" } + HTTP 200 when found,
   an error message and HTTP 404 otherwise. */
function handlegetkey(req, res) {
  var id = req.params.id;
  var obj = storage.getItem(id);
  var response = {};
  if (obj != null) {
    response.apikey = obj.apikey;
    res.statusCode = 200;
    res.json(response);
    console.log('Retrieved key for id: ' + id);
  } else {
    response.error = 'No key found';
    res.statusCode = 404;
    res.json(response);
    console.log('Cannot retrieve key for id: ' + id);
  }
}

/* Handler for DELETE /apikey/{id}: removes the stored key when present,
   returns HTTP 404 when the id is unknown. */
function handledeletekey(req, res) {
  var id = req.params.id;
  var response = {};
  if (storage.getItem(id) != null) {
    storage.removeItemSync(id);
    res.statusCode = 200;
    response.status = 'APIKey was removed from storage';
    res.json(response);
    console.log('Removal operation completed');
  } else {
    response.error = 'No key found';
    res.statusCode = 404;
    res.json(response);
  }
}

// initialize middleware
app.use(defaultContentTypeMiddleware);
app.use(bodyParser.json({ type: '*/*' }));

// map API routes to handler functions
app.get('/apikey/:id', handlegetkey);
app.delete('/apikey/:id', handledeletekey);
app.post('/apikey', handlepostkey);

app.listen(PORT, function () {
  console.log('Application is running and listening on port ' + PORT);
});
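Stripped of Express and node-persist, the semantics of the three routes boil down to a handful of operations on a key-value store. The condensed sketch below uses a plain in-memory object; the helper names are mine, not part of the actual service.

```javascript
// Minimal in-memory sketch of the three operations' semantics,
// using a plain object instead of node-persist.
var store = {};

function postKey(body) {                      // POST /apikey
  if (!body.apikey) return { status: 400, body: { error: 'No key supplied' } };
  store[body.id] = { apikey: body.apikey };
  return { status: 201, body: { status: 'key successfully stored' } };
}

function getKey(id) {                         // GET /apikey/{id}
  if (!store[id]) return { status: 404, body: { error: 'No key found' } };
  return { status: 200, body: { apikey: store[id].apikey } };
}

function deleteKey(id) {                      // DELETE /apikey/{id}
  if (!store[id]) return { status: 404, body: { error: 'No key found' } };
  delete store[id];
  return { status: 200, body: { status: 'APIKey was removed from storage' } };
}

console.log(postKey({ id: 'MeMyselfAndI', apikey: 'OpenSesame' }).status); // 201
console.log(getKey('MeMyselfAndI').body.apikey);                           // OpenSesame
```

This is also the behaviour the APIPCS service callout relies on in part two: a single GET with the internal id yields either the key or a 404.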
Now, for this application to run successfully inside the Oracle Application Container Cloud (O-ACC), an additional manifest file is required, defining the Node.js runtime needed as well as the command to start the app.
Manifest.json
{
  "runtime" : { "majorVersion" : "0.12" },
  "command" : "node apikey.js",
  "release" : {},
  "notes" : "Simple microservice to store and retrieve APIKEYs based on id"
}
The HTTP POST request below shows how an API key can be (persistently) stored, where the internal identifier is taken to be an email address (using the Chrome Postman application):
Using an HTTP GET request, specifying the internal key as a path parameter, the stored API key is retrieved and returned as the response body:
Deployment
As a bit of a disappointment, in order to deploy the application to Oracle's ACC, you not only need to ZIP all your application code and declarations, but also all of your application's dependencies. As this step is still present in Oracle's tutorial, I assume this has not changed – but I'd like to be proven wrong!