Paginated APIs in Flask

In Week 2 of GSoC I had the task of implementing paginated APIs in the Open Event project. I was aware that DRF provides this feature for Django, so I looked through the Internet for a similar library for Flask. Luckily, I didn’t find any, so I decided to make my own.

A paginated API is a page-based API. This approach is used because API data can sometimes be very large, and pagination helps break it into small chunks. The paginated API built in the Open Event project looks like this –

{
    "start": 41,
    "limit": 20,
    "count": 128,
    "next": "/api/v2/events/page?start=61&limit=20",
    "previous": "/api/v2/events/page?start=21&limit=20",
    "results": [
    	{
    		"data": "data"
    	},
    	{
    		"data": "data"
    	}
    ]
}

Let me explain what the keys in this JSON mean –

  1. start – It is the position from which we want the data to be returned.
  2. limit – It is the max number of items to return from that position.
  3. next – It is the URL for the next page of the query, assuming the current value of limit.
  4. previous – It is the URL for the previous page of the query, assuming the current value of limit.
  5. count – It is the total count of results available in the dataset. Here, as count is 128, you can go at most up to start=121 keeping limit at 20. Also, when you request the page with start=121 and limit=20, only 8 items will be returned.
  6. results – This is the list of results whose position lies within the bounds specified by the request.

Now let’s see how to implement it. I have simplified the code to make it easier to understand.

from flask import Flask, abort, request, jsonify
from models import Event

app = Flask(__name__)

@app.route('/api/v2/events/page')
def view():
    # query string values arrive as strings, so coerce them to int
    return jsonify(get_paginated_list(
        Event,
        '/api/v2/events/page',
        start=request.args.get('start', 1, type=int),
        limit=request.args.get('limit', 20, type=int)
    ))

def get_paginated_list(klass, url, start, limit):
    # check if page exists
    results = klass.query.all()
    count = len(results)
    if (count < start):
        abort(404)
    # make response
    obj = {}
    obj['start'] = start
    obj['limit'] = limit
    obj['count'] = count
    # make URLs
    # make previous url
    if start == 1:
        obj['previous'] = ''
    else:
        start_copy = max(1, start - limit)
        limit_copy = start - 1
        obj['previous'] = url + '?start=%d&limit=%d' % (start_copy, limit_copy)
    # make next url
    if start + limit > count:
        obj['next'] = ''
    else:
        start_copy = start + limit
        obj['next'] = url + '?start=%d&limit=%d' % (start_copy, limit)
    # finally extract result according to bounds
    obj['results'] = results[(start - 1):(start - 1 + limit)]
    return obj

Just to be clear, I am assuming you are using SQLAlchemy for the database. The klass parameter in the above code is the SQLAlchemy db.Model class you want to query for the results. The url is the base URL of the request, here ‘/api/v2/events/page’, and it is used when building the previous and next URLs. Everything else should be clear from the code.
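If you want to try it quickly, here is a minimal sketch using Flask’s test client (assuming the code above lives in the same module and that the Event rows are JSON-serializable):

import json

with app.test_client() as client:
    resp = client.get('/api/v2/events/page?start=41&limit=20')
    page = json.loads(resp.data)
    print(page['count'], page['next'], page['previous'])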

So this was how to implement your very own paginated API framework in Flask (or rather, Python). I hope you found this post interesting.

Until next time.

Ciao

 

{{ Repost from my personal blog http://aviaryan.in/blog/gsoc/paginated-apis-flask.html }}

Getting Signed Release APKs from the command line

 


If any of you have deployed an application on the Play Store, you have most probably used Android Studio’s built-in Generate Signed APK option.

The Generate Signed APK option in Android Studio

Recently, while making the Open Event APK generator, I had to build release APKs so that an event organiser could use them to publish their app. The APKs also had to be signed, because unsigned APKs are impossible to upload due to checks by Google.

Error shown on the developer console

Since I was building the app from the terminal, I didn’t have the luxury of signing it using Android Studio, so I had to look for alternatives. Luckily, I found two of them:

  1. Using the signing configs offered by Gradle
  2. Using jarsigner from the Oracle JDK

First of all, the signing configs in Gradle are a great way to do this. Most open source apps use them to put their code out for everyone to view while successfully hiding any private keys and passwords.

You just need to add a few lines of code to your app-level build.gradle file and create a file called keystore.properties.

In keystore.properties, we just store the sensitive info; this file should be accessible only to people who are part of the project.

storePassword=myStorePassword
keyPassword=mykeyPassword
keyAlias=myKeyAlias
storeFile=myStoreFileLocation

Next, we go to build.gradle and add these lines to read the keystore.properties file and its variables:

// Create a variable called keystorePropertiesFile, and initialize it to your
// keystore.properties file, in the rootProject folder.
def keystorePropertiesFile = rootProject.file("keystore.properties")

// Initialize a new Properties() object called keystoreProperties.
def keystoreProperties = new Properties()

// Load your keystore.properties file into the keystoreProperties object.
keystoreProperties.load(new FileInputStream(keystorePropertiesFile))

Next, we can add the signingConfigs block and reference the values we read above:

android {
    signingConfigs {
        config {
            keyAlias keystoreProperties['keyAlias']
            keyPassword keystoreProperties['keyPassword']
            storeFile file(keystoreProperties['storeFile'])
            storePassword keystoreProperties['storePassword']
        }
    }
    ...
  }

As you can see, it is as simple as that, but for my requirements this seemed a bit tedious: a person setting up the APK generator had to create a keystore file, then find the build.gradle and change the keystore path to match the server directories. So this does the trick, but it can be tedious for someone with no technical experience. I researched other solutions and then I found them: jarsigner and zipalign.

First of all, jarsigner and zipalign are two great tools, and the best part about them is that both work perfectly with just one-line commands. For signing the app:

jarsigner -keystore <keystore-file> -storepass <store-password> <apk-to-sign> <key-alias>

and then zipaligning:

zipalign -v 4 <unaligned-apk-location> <path-to-generated-aligned-apk>

So this is it; we finally used these two commands to sign and zipalign an APK, and it works perfectly fine. Please test the live demo at http://192.241.232.231 and share your comments. Ciao!

Migrations make us crazy!

Our Open Event team is not small, so problems with migrations occur very often. I’d like to describe how to solve these common problems and avoid contributor frustration, because most of us didn’t know how to solve them at the beginning and wasted a lot of time finding the source of the problem.

The most common mistake is that we forget to run the migration on Heroku. A developer is sometimes surprised because something works locally, but they forgot to run the migration on the server. These small mistakes lead to huge problems: the app throws lots of database-related bugs, and we often see an “Internal server error”. So if a developer changes a table, they have to run the migration!

vagrant@vagrant-ubuntu-trusty-64:/vagrant$ python manage.py db migrate

But the above command quite often doesn’t solve our problem, because we face other issues while updating the DB, for example:

alembic.util.exc.CommandError: Multiple head revisions are present for given argument ‘head’; please specify a specific target revision, ‘<branchname>@head’ to narrow to a specific head, or ‘heads’ for all heads

This problem is caused when two developers push migrations to GitHub without merging the two heads into one.

To solve this problem, you only have to know the IDs of the two heads:

vagrant@vagrant-ubuntu-trusty-64:/vagrant$ python manage.py db heads

e38935822969 (head)
f55fde3d62b1 (head)

Then you have to merge them:

vagrant@vagrant-ubuntu-trusty-64:/vagrant$ python manage.py db merge e38935822969 f55fde3d62b1

Generating /vagrant/migrations/versions/ecb671d1eb4b_.py … done
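For the curious, the generated merge revision is itself just a small Python file; it looks roughly like this (revision IDs taken from the example above, the rest trimmed):

"""empty message"""

# revision identifiers, used by Alembic.
revision = 'ecb671d1eb4b'
down_revision = ('e38935822969', 'f55fde3d62b1')  # the two heads being merged


def upgrade():
    pass


def downgrade():
    pass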

Then upgrade the DB:

vagrant@vagrant-ubuntu-trusty-64:/vagrant$ python manage.py db upgrade

INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO [alembic.runtime.migration] Will assume transactional DDL.
INFO [alembic.runtime.migration] Running upgrade 00ea66754d06 -> d84f976530e1, empty message
INFO [alembic.runtime.migration] Running upgrade d84f976530e1 -> 1b3e4f5f56bd, empty message
INFO [alembic.runtime.migration] Running upgrade 1b3e4f5f56bd -> e38935822969, empty message
INFO [alembic.runtime.migration] Running upgrade e38935822969, f55fde3d62b1 -> ecb671d1eb4b, empty message

And finally run the migration:

vagrant@vagrant-ubuntu-trusty-64:/vagrant$ python manage.py db migrate
INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO [alembic.runtime.migration] Will assume transactional DDL.
INFO [alembic.ddl.postgresql] Detected sequence named ‘role_id_seq’ as owned by integer column ‘role(id)’, assuming SERIAL and omitting
INFO [alembic.ddl.postgresql] Detected sequence named ‘microlocation_id_seq’ as owned by integer column ‘microlocation(id)’, assuming SERIAL and omitting….

Open Event Organizer’s Server: Custom Forms

As we know, the Open Event Organizer’s Server provides a means of creating events, and an important part of events is sessions and their speakers. But different events require different details for sessions and speakers. Some might feel that only the name and email ID of a speaker is enough, while others may require their GitHub, Facebook, etc. links. So the best way to solve this is to make a custom form, so that organizers get to select which fields they want in the forms. But how to do this? Well, thanks to JSON.

In Step 5 of the Event Creation Wizard, we have options (or switches) to include and require the various fields available for the session form and the speaker form. Here is how it looks:
[Screenshot: the custom form switches in the event creation wizard]

As we can see, each field has two options – Include and Require. Clicking the Include switch enables the Require switch through jQuery. The Include switch means that the field is included in the form, while the Require switch signifies that the field will be compulsory.

Storing Options

We create a JavaScript array containing a single object. This object has field names as keys, each mapping to another object containing the keys include and require. The value of include and require is 0 by default; when a switch is selected, the value changes to 1. An example structure of the array looks like this:

[{
    "title": {
        "include": 1,
        "require": 1
    },
    "subtitle": {
        "include": 0,
        "require": 0
    },
    "short_abstract": {
        "include": 1,
        "require": 0
    },
    "long_abstract": {
        "include": 0,
        "require": 0
    },
    "comments": {
        "include": 1,
        "require": 0
    },
    "track": {
        "include": 0,
        "require": 0
    },
    "session_type": {
        "include": 0,
        "require": 0
    },
    "language": {
        "include": 0,
        "require": 0
    },
    "slides": {
        "include": 1,
        "require": 0
    },
    "video": {
        "include": 0,
        "require": 0
    },
    "audio": {
        "include": 0,
        "require": 0
    }
}]

This array is then converted to a string using JSON.stringify, submitted through the form, and stored in the database as a string.

Rendering Forms

We have already stored the options selected by the organizer, but now we need to render the forms based on the JSON string stored in the database. First, on the server side, we convert the string to a JSON object in Python using json.loads. Then we send these JSON objects to the templates.
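As a rough sketch (the route, model and column names here are only illustrative, not the actual server code), the server-side part can look like this:

import json

from flask import render_template


@app.route('/events/<int:event_id>/sessions/new')
def new_session(event_id):
    # load the stored custom-form string for this event (assumed model/column names)
    custom_form = CustomForms.query.filter_by(event_id=event_id).first()
    session_form = json.loads(custom_form.session_form)[0]
    # the template decides which fields to render based on include/require
    return render_template('session_form.html', session_form=session_form)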

In the templates, we create form elements based on the include and require values of each field. A sample element’s HTML looks like this:

[Screenshot: sample form element HTML]

We use a loop to iterate over the JSON object and add the elements which have include = 1. The final form looks something like this:

[Screenshot: the final session form]

Errors and Error handlers for REST API

Errors are an essential part of a REST API system. Error instances must follow a particular structure so that the client developer can handle them correctly on the client side. We set a proper error structure at the beginning of creating the REST APIs. It’s as follows:

{
    "error": {
        "message": "Error description",
        "code": 400
    }
}

Any error occurring during client-server communication would follow the above format. code is the returned status code and message is a brief description of the error. To raise an error, we used an _error_abort() function, which was an abstraction over Flask-RESTplus’s abort(). We defined the error structure inside _error_abort() and passed it to abort().

from flask_restplus import abort

def _error_abort(code, message):
    error = {
        'code': code,
        'message': message,
    }
    abort(code, error=error)

This method had its limitations. Since the error handlers were not being overridden, only errors raised through _error_abort() had the defined structure. So if an internal server error occurred, the returned error response wouldn’t follow the format.

To overcome this, we wrote our own exceptions for errors and created error handlers to handle them. We first made the response structure more detailed, so the client developer can understand what kind of error is being returned.

{
    "error": {
        "code": 400,
        "message": "'name' is a required parameter",
        "status": "INVALID_FIELD",
        "field": "name"
    }
}

The above is an example of a validation error.

The “status” key helps make the error more specific. For example, we use the 400 status code for both validation errors and invalid service errors (“service not belonging to said event”), but they have different statuses: “INVALID_FIELD” and “INVALID_SERVICE”.

The “field” key is only useful in the case of validation errors, where it names the field that did not pass validation. For other errors it remains null.

I first documented what kind of errors we would need.

Code  Status             Field          Description
401   NOT_AUTHORIZED     null           Invalid token or user not authenticated
400   INVALID_FIELD      "field_name"   Missing or invalid field
400   INVALID_SERVICE    null           Service ID mentioned in path does not belong to said Event
404   NOT_FOUND          null           Event or Service not found
403   PERMISSION_DENIED  null           User role not allowed to perform such action
500   SERVER_ERROR       null           Internal server error

The next part was creating exception classes for each one of these. I created a base error class that extended the Python Exception class.

class BaseError(Exception):
    """Base Error Class"""

    def __init__(self, code=400, message='', status='', field=None):
        Exception.__init__(self)
        self.code = code
        self.message = message
        self.status = status
        self.field = field

    def to_dict(self):
        return {'code': self.code,
                'message': self.message,
                'status': self.status,
                'field': self.field, }

The to_dict() method would help when returning the response in error handlers.

I then extended this base class to other error classes. Here are three of them:

class NotFoundError(BaseError):
    def __init__(self, message='Not found'):
        BaseError.__init__(self)
        self.code = 404
        self.message = message
        self.status = 'NOT_FOUND'


class NotAuthorizedError(BaseError):
    def __init__(self, message='Unauthorized'):
        BaseError.__init__(self)
        self.code = 401
        self.message = message
        self.status = 'NOT_AUTHORIZED'


class ValidationError(BaseError):
    def __init__(self, field, message='Invalid field'):
        BaseError.__init__(self)
        self.code = 400
        self.message = message
        self.status = 'INVALID_FIELD'
        self.field = field

I then defined the error handlers for the api:

@api.errorhandler(NotFoundError)
@api.errorhandler(NotAuthorizedError)
@api.errorhandler(ValidationError)
@api.errorhandler(InvalidServiceError)
def handle_error(error):
    return error.to_dict(), getattr(error, 'code')

To override the default error handler, Flask-RESTplus lets you create one with the same decorator, but without passing an argument to it.

@api.errorhandler
def default_error_handler(error):
    """Returns Internal server error"""
    error = ServerError()
    return error.to_dict(), getattr(error, 'code', 500)

I had set the default error to be the internal server error.

class ServerError(BaseError):
    def __init__(self, message='Internal server error'):
        BaseError.__init__(self)
        self.code = 500
        self.message = message
        self.status = 'SERVER_ERROR'

Now raising any of these error classes would activate the error handlers and a proper response would be sent to the client.
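As a quick illustration, raising one of these errors inside a resource looks roughly like this (the resource, model and serialize() helper here are hypothetical placeholders, not the actual server code):

from flask_restplus import Resource

# hypothetical resource; EventModel, the `api` namespace and serialize() are placeholders
@api.route('/events/<int:event_id>')
class EventResource(Resource):
    def get(self, event_id):
        event = EventModel.query.get(event_id)
        if event is None:
            # the NotFoundError handler turns this into a 404 JSON error response
            raise NotFoundError(message='Event %d not found' % event_id)
        return serialize(event)  # marshalling/serialization omitted for brevity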

Open Event Apk generator

So we made this APK generator, currently hosted on a server (http://192.241.232.231), which lets you generate an Android app for your event in 10 minutes, out of which the server takes about 8 minutes to build 😛. So essentially you just have to spare 2 minutes and enter three things (email, desired app name and API link). Isn’t this cool?

So how exactly do we do this?

At the backend, we are running a Python script along with some shell scripts. The Python script basically creates directories and interacts with our Firebase database to get the data entered by the user. These scripts first clone the Open Event Android repo, then customise and change the code in the repo according to the parameters entered by the organiser earlier (shown in the image).

[Screenshot: the generator website]

After the scripts have changed the code to customise the app for the event it will be used for, we move on to the script that builds the APK. There we build and sign the APK in release mode using signing configs and release keys stored on the server, because we don’t want the organiser to have to generate keys and store them on the server, both to avoid the hassle and because of the privacy concerns involving the keys. So this is when the APK is generated on the server. Now you may have questions: the APK is generated, but how do I get it? Do I have to wait on the same page for 10 minutes while it is being built? The answer is both yes and no. You can wait on the page to download the APK if you want, but we’ll send it to the email you specified on the APK generator website anyway. This is the sample email we got when we were testing the system:

[Screenshot: the sample email]

So it’s a complete end-to-end solution with which you can easily get an Android app for your event with just 2 minutes of effort. I would love to hear feedback and reviews. Please feel free to contact me at manan13056@iiitd.ac.in or on social media (Twitter, Facebook). Adios!

P.S.: Here is the link to the scripts we’re using.

Why should you use Heroku?

Last week I dedicated most of my time to implementing new Heroku features in Open Event. Since these weren’t easy tasks, I’d like to share my experience.

What is Heroku?

Heroku is a cloud Platform-as-a-Service (PaaS) supporting several programming languages like Java, Node.js, Scala, Clojure, Python, PHP, and Go.

Easy to deploy

What you need to do:

  1. Create an account on Heroku.
  2. Download the Heroku Toolbelt to your environment.
  3. Go to your project directory (/open-event-orga-server).
  4. Sign in to Heroku on the command line using your credentials:
    $ heroku login
  5. Create an app with a name:
    $ heroku apps:create your_app_name

    After executing the above command you will receive a link to your application:

    http://your_app_name.herokuapp.com/
  6. Push the latest changes to Heroku:
    $ git push heroku master
  7. If everything is OK, you can check the results at http://your_app_name.herokuapp.com/ (sometimes it does not work like we want 🙁).

Easy to configure

To list/set/get config variables use:

$ heroku config
$ heroku config:set YOUR_VARIABLE=token123
$ heroku config:get YOUR_VARIABLE

Or you can go to your application dashboard and perform the above operations there.

How can you access these variables in Python?

$ python

Python 2.7.10 (default, Oct 23 2015, 19:19:21) 

[GCC 4.2.1 Compatible Apple LLVM 7.0.0 (clang-700.0.59.5)] on darwin

Type "help", "copyright", "credits" or "license" for more information.

>>> import os

>>> your_variable  = os.environ.get('YOUR_VARIABLE', None)

>>> print your_variable
token123

I’ve used this to display the current release in my Python application (you need to generate a special API token and add it to the config variables):

os.popen('curl -n https://api.heroku.com/apps/open-event/releases '
         '-H "Authorization: Bearer ' + token + '" '
         '-H "Accept: application/vnd.heroku+json; version=3"').read()

Easy to monitor

If something is wrong with your app, you need to use this command:

$ heroku logs

It shows all the logs.

To see the 10 latest releases, use:

$ heroku releases

How can you set up Open Event to deploy to Heroku?

  1. Clone https://github.com/fossasia/open-event-orga-server
  2. Go to the directory of the open event orga server (/open-event-orga-server)
  3. Add the git remote:
     $ heroku git:remote -a open-event
  4. You can check that open-event has been added to the git remotes:
    $ git remote -v
    heroku https://git.heroku.com/open-event.git (fetch)
    heroku https://git.heroku.com/open-event.git (push)
    origin https://github.com/fossasia/open-event-orga-server.git (fetch)
    origin https://github.com/fossasia/open-event-orga-server.git (push)
  5. Now you can deploy changes to the open-event application (you need permissions 🙂).

Why should you use Heroku?

It’s great for deploying apps because you are able to share your work in a short time, which is what I’ve done. Besides, it’s very well documented, so you can find answers to most of your questions there. Finally, most things can be configured using the Heroku dashboard, which is one of the biggest advantages of this tool.

Better fields and validation in Flask Restplus

We at the Open Event Server project are using flask-restplus for the API. Apart from auto-generating the Swagger specification, another great plus point of restplus is how easily we can set input and output models, which are then automatically shown in the Swagger UI. We can also auto-validate the input in POST/PUT requests to make sure that we get what we want.

@api.expect(EVENT_POST, validate=True)
def put(self, id):
    """Modify object at id"""
    pass

As can be seen above, the validate param of the namespace.expect decorator allows us to auto-validate the input payloads. This used to work well until one day I realized there were a few problems:

  1. When a field was defined as, say, fields.Integer, it would accept only integer values, not even null.
  2. If there is a string field with the required param set to True, it is still possible to set an empty string as its value and the in-built validator won’t catch it.
  3. Even if I somehow managed to hack my way into supporting null in a field, it would then accept null even when required=True.
  4. We had no control over what error message was returned.

For reference, here is an example model –

EVENT = api.model('Event', {
    'id': fields.Integer,
    'name': fields.String(required=True)
})

Problem #1 especially was a huge one, as it questioned the whole foundation of the API. So we realized it would be better not to use namespace.expect and to use a custom validator instead. For the custom validator, we first had to create custom fields that the validator could benefit from. Luckily, flask-restplus comes with a great API for creating custom fields. So we quickly created custom fields for all common fields (Integer, String) and more specific fields like Email, Uri and Color. Creating these specific fields was a huge advantage, as now we can show a proper example for each field type in the Swagger UI.

class Email(fields.String):
    """
    Email field
    """
    __schema_type__ = 'string'
    __schema_format__ = 'email'
    __schema_example__ = 'email@domain.com'

Consider the above code; now when we use Email as the field for a value, the example shown for it in the Swagger UI will be ‘email@domain.com’. Quite cool, right?

Now we needed a way to validate these fields. For that, we created a validate method in each of the field classes. This validate method gets the value and checks whether it is valid. Consider the following code –

import re

from flask_restplus import fields

EMAIL_REGEX = re.compile(r'\S+@\S+\.\S+')

class Email(fields.String):
    def validate(self, value):
        if not value:
            return False if self.required else True
        if not EMAIL_REGEX.match(value):
            return False
        return True

Once each of the fields had its validate method, we created a validate_payload() function that takes the API model and compares it with the payload. It first checks whether all required keys are present in the payload. When that is true, it validates each field’s value using the field class’s validate method.

from flask import abort
from flask_restplus import fields
from custom_fields import CustomField

def validate_payload(payload, api_model):
    # check if any reqd fields are missing in payload
    for key in api_model:
        if api_model[key].required and key not in payload:
            abort(400, "Required field '%s' missing" % key)
    # check payload
    for key in payload:
        field = api_model[key]
        if isinstance(field, fields.List):
            field = field.container
            data = payload[key]
        else:
            data = [payload[key]]
        if isinstance(field, CustomField) and hasattr(field, 'validate'):
            for i in data:
                if not field.validate(i):
                    abort(400, "Validation of '%s' field failed" % key)

CustomField is the base class that each of the custom fields mentioned above inherits from, so checking whether field is an instance of CustomField is enough to know if it is a custom field or not. Another thing that may look weird in the above code is the use of fields.List. If you look closely, I added this to support custom fields inside lists. So if you have used a custom field in a list, it will work too. Obviously, this only supports single-level lists for now; we didn’t need more than that, so I let it go. 😜
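To tie it together, here is a rough sketch of how validate_payload can be called from a POST handler (the create_event helper and the exact namespace setup are illustrative, not the actual server code):

from flask import request
from flask_restplus import Resource


@api.route('/events')
class EventList(Resource):
    @api.expect(EVENT_POST)  # still used for the Swagger docs, not for validation
    def post(self):
        payload = request.get_json()
        validate_payload(payload, EVENT_POST)  # aborts with 400 if anything is invalid
        return create_event(payload), 201  # create_event is an assumed DAO helper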

This basically sums up how we are validating input payloads at Open Event. Of course, this is very basic, but we will keep improving it as the project progresses. Stay tuned to the Open Event blog if you want to keep in touch with the progress of the project.

Links to full code at the time of writing this post are –

  1. Custom Fields
  2. Validate Payload

I hope you found this post useful. Thanks for reading.

 

{{ Repost from my personal blog http://aviaryan.in/blog/gsoc/restplus-validation-custom-fields.html }}

Export Timeline as iCal

iCal, or iCalendar, is the Internet standard for sharing event schedules or session timelines. The filename extension for iCal is .ics. It is supported by a number of applications such as Google Calendar, Apple Calendar, Yahoo! Calendar, the Lightning extension for Mozilla Thunderbird and SeaMonkey, and partially by Microsoft Outlook. In the Open Event Organizer’s Server we have added a feature to export the schedule of a particular event in iCal format.

Continue reading “Export Timeline as iCal”

Swagger

Swagger is a specification for describing REST APIs. The main aim of Swagger is to provide a REST API definition format that is readable by both machines and humans.

You can think of two entities in a REST API: One the provider of API, and another the client using the API. Swagger essentially covers the gap between them by providing a format that is easy to use by the client and easy for the provider to define.

If not using Swagger, one would most certainly be creating the API first, writing the documentation (human-readable) with it. The client developer would read the documentation and use APIs as required. With Swagger, the specification can be considered as the document itself, helping both the client developer and the provider.

Here’s an example spec I wrote for our GET APIs at Organizer Server: https://gist.github.com/shivamMg/dacada0b45585bcd9cd0fbe4a722eddf

The format, although readable, isn’t really what a client developer would be asking for.

Remember that the format is machine-readable? What does that mean exactly?

Since the Swagger spec is a defined format, the provider can document it and people can write programs that understand the Swagger specification. Swagger itself comes with a set of tools (http://swagger.io/tools/) that use Swagger definitions created by the API provider to create SDKs for the clients to use.

Swagger-UI

One of our most used tools at our server is the Swagger UI (http://swagger.io/swagger-ui/).

It reads an API spec written in Swagger to generate a corresponding UI that people can use to explore the APIs. Every API endpoint can have responses and parameters associated with it. For instance, our Event endpoint at the Server (“/events/:event_id”).

You can see how the UI displays the Model Schema, required parameters and possible response message.

[Screenshot: Swagger UI for the Event endpoint]

Apart from documentation, Swagger UI provides the “Try it out!” tool that lets you make requests to the server for the corresponding API. This feature is incredibly useful for POST requests. No need for long curl commands in the terminal.

Here’s the Swagger config from our demo application: https://open-event.herokuapp.com/api/v2/swagger.json

The Swagger UI for this config can be found at https://open-event.herokuapp.com/api/v2

The UI for the example spec (gist) I linked before can be browsed here.

Swagger-js

https://github.com/swagger-api/swagger-js

Swagger-js is a JS library that reads an API spec written in Swagger and provides an interface for the client developer to interact with the API. We will be using Swagger-js for the Open Event Webapp.

Here’s an example to show you how it works. Let’s say the following endpoint returns (json) a list of events.

/events

The client developer can create a GET request and render the list with HTML.

$.getJSON("http://example.com/api/v2/events", function(data) {

  var events = "";
  $.each(data, function(i, event) {
    events += "<li id='event_" + i + "'>" + event.name + "</li>";
  });

  $("ul#events").html(events);
});

What if the provider moves the endpoint to “/event/all”? The client developer would need to change every instance of the URL to http://example.com/event/all.

Let’s now take the case of Swagger-js. With Swagger-js the client developer would essentially be writing programs to interact with the API Swagger spec defined by the provider, instead of directly consuming the API.

window.client = new SwaggerClient({
    url: "http://example.com/api/v2/swagger.json",
    success: function() {
      client.event.getEvents({
        responseContentType: 'application/json'
      }, function(data) {

        /* Create `events` string same as before */

        $("ul#events").html(events);
      });
    }
  });

 

The APIs at the Organizer server are getting more complex. Swagger helps in keeping the spec well defined.

That’s all for now. Hope you enjoyed the read.