Keeping Order of tickets in Event Wizard in Sync with API on Open Event Frontend

This blog article will illustrate how tickets are stored and displayed in the order the event organiser decides on Open Event Frontend, and how that order is kept in sync with the backend.

First we will take a look at how the user is able to control the order of the tickets using the ticket widget.

{{#each tickets as |ticket index|}}
  {{widgets/forms/ticket-input ticket=ticket
    timezone=data.event.timezone
    canMoveUp=(not-eq index 0)
    canMoveDown=(not-eq ticket.position (dec data.event.tickets.length))
    moveTicketUp=(action 'moveTicket' ticket 'up')
    moveTicketDown=(action 'moveTicket' ticket 'down')
    removeTicket=(confirm 'Are you sure you wish to delete this ticket?' (action 'removeTicket' ticket))}}
{{/each}}

The canMoveUp and canMoveDown properties are dynamic and depend upon the current positions of the tickets in the tickets array. They define whether the up arrow, the down arrow, or both should be visible alongside the ticket to trigger the moveTicket action.

There is an attribute called position in the ticket model which is responsible for storing the position of the ticket on the backend. Hence, the list of tickets should always be ordered by position. However, it should be kept in mind that even if the position attribute of the tickets is changed, it will not actually change the indices of the ticket records in the array fetched from the API. Since we want the ticket order in sync with the backend, i.e. the user shouldn't have to refresh to see changes in ticket order, we return the tickets via a computed property which sorts them in the required order.

tickets: computed('data.event.tickets.@each.isDeleted', 'data.event.tickets.@each.position', function() {
   return this.get('data.event.tickets').sortBy('position').filterBy('isDeleted', false);
 })

The sortBy method ensures that the tickets are always ordered, and the computed property watches the position of each ticket so that any change is picked up immediately. Now we can define the moveTicket action to enable modification of ticket positions.

moveTicket(ticket, direction) {
     const index = ticket.get('position');
     const otherTicket = this.get('data.event.tickets').find(otherTicket => otherTicket.get('position') === (direction === 'up' ? (index - 1) : (index + 1)));
     otherTicket.set('position', index);
     ticket.set('position', direction === 'up' ? (index - 1) : (index + 1));
   }

The moveTicket action takes two arguments, ticket and direction. It temporarily stores the position of the current ticket and finds the ticket whose position needs to be swapped with it. Based on the direction, the positions are swapped. Since the position of each ticket is watched by the tickets computed property, the change in order becomes apparent immediately.

Now when the user triggers the save request, the position of each ticket is updated via a PATCH request, or a POST request if the ticket is new.
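A save flow along the following lines could persist the new positions. This is only an illustrative sketch, not the wizard's actual code; the action name saveTickets and the notification messages are assumptions:

// Illustrative sketch only: persist every ticket so the backend positions stay in sync.
// ticket.save() issues a POST for new records and a PATCH for existing ones.
saveTickets() {
  this.set('isLoading', true);
  RSVP.all(this.get('data.event.tickets').toArray().map(ticket => ticket.save()))
    .then(() => {
      this.notify.success(this.l10n.t('Tickets have been saved successfully.'));
    })
    .catch(() => {
      this.notify.error(this.l10n.t('An unexpected error has occurred.'));
    })
    .finally(() => {
      this.set('isLoading', false);
    });
}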

Also, the positions of all the tickets may be affected when adding a new ticket or deleting an existing one. In the case of a new ticket, its position should be initialised at creation time so that it appears below all the other tickets.

addTicket(type, position) {
     const salesStartDateTime = moment();
     const salesEndDateTime = this.get('data.event.startsAt');
     this.get('data.event.tickets').pushObject(this.store.createRecord('ticket', {
       type,
       position,
       salesStartsAt : salesStartDateTime,
       salesEndsAt   : salesEndDateTime
     }));
   }

Deleting a ticket requires updating positions of all the tickets below the deleted ticket. All of the positions need to be shifted one place up.

removeTicket(deleteTicket) {
     const index = deleteTicket.get('position');
     this.get('data.event.tickets').forEach(ticket => {
       if (ticket.get('position') > index) {
         ticket.set('position', ticket.get('position') - 1);
       }
     });
     deleteTicket.deleteRecord();
   }

The tickets whose position needs updating are selected by comparing their position with that of the deleted ticket.


Implementing Order Statistics API on Tickets Route in Open Event Frontend

The order statistics API endpoints are used to display the statistics related to tickets, orders, and sales. It contains the details about the total number of orders, the total number of tickets sold and the amount of the sales. It also gives the detailed information about the pending, expired, placed and completed orders, tickets, and sales.

This article will illustrate how the order statistics can be displayed using the Order Statistics API in Open Event Frontend. The primary endpoint of the Open Event API with which we are concerned for statistics is

GET /v1/events/{event_identifier}/order-statistics

First, we need to create a model for the order statistics, which will have the fields corresponding to the API, so we proceed with the ember CLI command:

ember g model order-statistics-tickets

Next, we need to define the model according to the requirements. The model needs to extend the base model class. The code for the model looks like this:

import attr from 'ember-data/attr';
import ModelBase from 'open-event-frontend/models/base';

export default ModelBase.extend({
  orders  : attr(),
  tickets : attr(),
  sales   : attr()
});

Since we need to display the statistics related to orders, tickets, and sales, the model has the respective attributes which fetch and store the details from the API.

Now, after creating a model, we need to make an API call to get the details. This can be done using the following:

return this.modelFor('events.view').query('orderStatistics', {});

Since the tickets route is nested inside the events.view route, we first get the model for the events.view route and then query the order statistics from it.
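Put together, a minimal sketch of the tickets route's model hook could look like this (the exact file layout in the project may differ):

import Ember from 'ember';

const { Route } = Ember;

export default Route.extend({
  model() {
    // The parent events.view route loads the event; query its order statistics.
    return this.modelFor('events.view').query('orderStatistics', {});
  }
});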

The complete code can be seen here.

Now, we need to use the model inside the template file to display the details. To fetch the total orders we can write this:

{{model.orders.total}}

In a similar way, the total sales can be displayed like this.

{{model.sales.total}}

And total tickets can be displayed like this.

{{model.tickets.total}}

If we want to fetch other details, like the pending sales or completed orders, the only thing we need to replace is the total attribute. In place of total, we can add any other attribute depending on the requirement, as shown below. The complete code of the template can be seen here.
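For instance, assuming the API exposes pending and completed keys as described earlier, pending orders and completed sales could be shown as:

{{model.orders.pending}}
{{model.sales.completed}}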

The UI for the order statistics on the tickets route looks like this.

Fig. 1: The user interface for displaying the statistics

The complete source code can be seen here.


Implementing Pages API in Open Event Frontend

The pages endpoints are used to create static pages, such as an about page or any other page that doesn't need to be updated frequently and only shows specific content. This article will illustrate how pages can be added or removed from the /admin/content/pages route using the pages API in Open Event Frontend. The primary endpoint of the Open Event API with which we are concerned for pages is

GET /v1/pages

First, we need to create a model for the pages, which will have the fields corresponding to the API, so we proceed with the ember CLI command:

ember g model page

Next, we need to define the model according to the requirements. The model needs to extend the base model class. The code for the page model looks like this:

import attr from 'ember-data/attr';
import ModelBase from 'open-event-frontend/models/base';

export default ModelBase.extend({
  name        : attr('string'),
  title       : attr('string'),
  url         : attr('string'),
  description : attr('string'),
  language    : attr('string'),
  index       : attr('number', { defaultValue: 0 }),
  place       : attr('string')
});

A page has a name, a title, a url which tells the URL of the page, a language, a description, an index, and a place where it has to appear, which can be either the footer or an event.

The complete code for the model can be seen here.

Now, after creating a model, we need to make an API call to get and post the pages created. This can be done using the following:

return this.get('store').findAll('page');

The above line checks the store and finds all the pages which have been cached; if no records are found, it makes an API call and caches the records in the store so that subsequent calls can return them immediately.

In the case of pages we have multiple operations, like creating a new page, updating an existing page and deleting an existing page. For creating and updating a page we have a form which has the fields required by the API. The UI of the form looks like this.

Fig. 1: The user interface of the form used to create the page.

Fig. 2: The user interface of the form used to update and delete the already existing page

The code for the above form can be seen here.
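As a rough sketch (not the project's exact code), the form's submit action could create or update a page along these lines, assuming a data property holding the page record and the isFormOpen flag used by deletePage further below; the action name savePage is hypothetical:

savePage(data) {
  // save() issues a POST for a new record and a PATCH for an existing one.
  data.save()
    .then(() => {
      this.set('isFormOpen', false);
      this.notify.success(this.l10n.t('Page has been saved successfully.'));
    })
    .catch(() => {
      this.notify.error(this.l10n.t('An unexpected error has occurred.'));
    });
}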

Now, clicking an item in the sidebar on the left lets us edit and update the page: the stored information is displayed in the form and the details are later updated on the server by clicking the Update button. If we want to delete the page we can do so using the Delete button, which first shows a pop up to confirm whether we actually want to delete it. The code for displaying the delete confirmation pop up looks like this.

<button class="ui red button" 
{{action (confirm (t 'Are you sure you would like to delete this page?') (action 'deletePage' data))}}>
{{t 'Delete'}}</button>

 

The code to delete the page looks like this

deletePage(data) {
    if (!this.get('isCreate')) {
      data.destroyRecord();
      this.set('isFormOpen', false);
    }
  }

In the above piece of code, we check whether the form is in create mode or update mode; only if it is not in create mode, i.e. an existing page is being edited, do we destroy the record and then close the form.

The UI for the pop up looks like this.

Fig.3: The user interface for delete confirmation pop up

The code for the entire process of page creation to deletion can be checked here.

To conclude, this is how we efficiently handle page creation, updating and deletion using the Open-Event-Orga pages API, ensuring that there are no unnecessary API calls to fetch the data and no code duplication.


Implementing Notifications in Open Event Server

In FOSSASIA's Open Event Server project, along with emails, almost all actions have necessary user notifications as well. So, when a new session is created or a session is accepted by the event organisers, along with the email, a user notification is also sent. Though showing the user notification is mainly implemented on the frontend site, the content to be shown and the actions on which to show it are strictly decided by the server project.

A notification essentially helps a user get the necessary information while staying on the platform itself, without needing to check his/her email for every action performed. So unlike email, which acts as a backup for the information, a notification is more of an instant thing.

The API

The Notifications API is mostly like all other JSON API endpoints in the Open Event project. However, in the Notifications API we do not allow anyone to send a POST request. The admin of the server is able to send a GET request to view all the notifications in the system, while a user can only view his/her own notifications. As for PATCH, we allow only the user to edit his/her notifications to mark them as read or unread. Following is the schema for the API:

class NotificationSchema(Schema):
    """
    API Schema for Notification Model
    """

    class Meta:
        """
        Meta class for Notification API schema
        """
        type_ = 'notification'
        self_view = 'v1.notification_detail'
        self_view_kwargs = {'id': '<id>'}
        self_view_many = 'v1.microlocation_list_post'
        inflect = dasherize

    id = fields.Str(dump_only=True)
    title = fields.Str(allow_none=True, dump_only=True)
    message = fields.Str(allow_none=True, dump_only=True)
    received_at = fields.DateTime(dump_only=True)
    accept = fields.Str(allow_none=True, dump_only=True)
    is_read = fields.Boolean()
    user = Relationship(attribute='user',
                        self_view='v1.notification_user',
                        self_view_kwargs={'id': '<id>'},
                        related_view='v1.user_detail',
                        related_view_kwargs={'notification_id': '<id>'},
                        schema='UserSchema',
                        type_='user'
                        )


The main things that are shown in the notification from the frontend are the title and message. The title is the text shown without expanding the entire notification, giving an overview of the message in case you don't want to read it entirely. The message provides the entire detail associated with the action performed. The user relationship stores which user the particular notification is related to. It is a one-to-one relationship where one notification can only belong to one user; however, one user can have multiple notifications. Another important attribute is the is_read attribute. This is the only attribute that is allowed to be changed. By default, when we make an entry in the database, is_read is set to FALSE. Once a user has read the notification, a request is sent from the frontend to change is_read to TRUE.
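For example, marking a notification as read could be done with a JSON API PATCH request along the following lines (the notification id, the token and the exact endpoint path are placeholders here, not values from the project):

curl -H "Authorization: JWT <access token>" \
     -H "Content-Type: application/vnd.api+json" \
     -X PATCH http://localhost:5000/v1/notifications/1 \
     -d '{"data": {"type": "notification", "id": "1", "attributes": {"is-read": true}}}'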

The different actions for which we send notification are stored in the models/notification.py file as global variables.

USER_CHANGE_EMAIL = 'User email'
NEW_SESSION = 'New Session Proposal'
PASSWORD_CHANGE = 'Change Password'
EVENT_ROLE = 'Event Role Invitation'
TICKET_PURCHASED = 'Ticket(s) Purchased'
TICKET_PURCHASED_ATTENDEE = 'Ticket(s) purchased to Attendee    '
EVENT_EXPORTED = 'Event Exported'
EVENT_EXPORT_FAIL = 'Event Export Failed'
EVENT_IMPORTED = 'Event Imported'

HTML Templates

The notification title and message that are stored in the database and later served via the Notifications API are created using string-formatted HTML templates. We first import all the global variables that represent the various actions from the notification model. Then we declare a global dict variable named NOTIFS which stores all the titles and messages to be stored in the notification table.

NEW_SESSION: {
        'title': u'New session proposal for {event_name}',
        'message': u"""The event <strong>{event_name}</strong> has received
             a new session proposal.<br><br>
            <a href='{link}' class='btn btn-info btn-sm'>View Session</a>""",
        'recipient': 'Organizer',
    },


This is an example of the contents stored inside the dict. For every action there is a dict with the keys title, message and recipient. Title contains a brief overview of the notification, whereas message contains a more detailed description with proper links. Recipient names the one who receives the notification. So, for example, the above code snippet is the notification for a newly created session, and it goes to the organizer. The title and message contain named placeholders which are later replaced by particular values using Python's .format() function.

Notification Helper

The notification helper module contains two main parts:

  1. A parent function which saves the notification to the table related to the user to whom the notification belongs.
  2. Individual notification helper functions that are used by the APIs to save notification after various actions are performed.

Parent Function

def send_notification(user, action, title, message):
    if not current_app.config['TESTING']:
        notification = Notification(user_id=user.id,
                                    title=title,
                                    message=message,
                                    action=action
                                    )
        save_to_db(notification, msg="Notification saved")
        record_activity('notification_event', user=user, action=action, title=title)


send_notification() is the parent function which takes as parameters user, action, title and message and stores them in the notification table in the database. The user is the one to whom the notification belongs, action represents the particular API action which triggered the notification, and title and message are the contents shown on the frontend in the form of a notification. The frontend can implement it as a dropdown notification like Facebook or a desktop notification like Gitter or WhatsApp. After the notification is saved we also update the activity table, recording that a notification has been saved for a user with the given action and title. Later, the Notifications API mentioned at the very beginning of the blog uses the data that is being stored now and serves it as a JSON response.

Individual Functions

Apart from this, we have individual functions that use the parent function to store notifications specific to particular actions. For example, we have a send_notif_new_session_organizer() function which is used to save a notification for all the organizers of an event that a new session has been added to their event. This function is called when a POST request is made to the Sessions API and the data is saved successfully. The function is executed for every organizer of the event for which the session has been created.

def send_notif_new_session_organizer(user, event_name, link):
    message_settings = MessageSettings.query.filter_by(action=NEW_SESSION).first()
    if not message_settings or message_settings.notification_status == 1:
        notif = NOTIFS[NEW_SESSION]
        action = NEW_SESSION
        title = notif['title'].format(event_name=event_name)
        message = notif['message'].format(event_name=event_name, link=link)

        send_notification(user, action, title, message)


In the above function, we take in 3 parameters: user, event_name and link. The value of the user parameter is used to link the notification to that particular user. event_name and link are used in the title and message of the notification that is saved in the database. First, the function checks whether there is a message setting which says that the user doesn't want to receive notifications related to new sessions being created. If not, we proceed and get the title and message strings from the NOTIFS dict in the system_notification.py file.

After that, using string formatting we get the actual message. For example,

u'New session proposal for {event_name}'.format(event_name='FOSSASIA')

would give us a resulting string of the form:

u'New session proposal for FOSSASIA'

After this, we use these variables as parameters to the send_notification() parent function to save the notification properly.

 


Implement Email in Open Event Server

In FOSSASIA's Open Event Server project, we send out emails when various actions are performed using the API. For example, when a new user is created, he/she receives a welcome email as well as an email verification email. Users get role invites from event organisers in the form of emails, and when someone buys a ticket he/she gets a PDF link to the ticket by email. So, as you can understand, all the important information that the user needs to be notified of is sent as an email to the user and sometimes to the organizer as well.

In FOSSASIA, we use sendgrid’s API or an SMTP server depending on the admin settings for sending emails. You can read more about how we use sendgrid’s API to send emails in FOSSASIA here. Now let’s dive into the modules that we have for sending the emails. The three main parts in the entire email sending are:

  1. Model – Storing the Various Actions
  2. Templates – Storing the HTML templates for the emails
  3. Email Functions – Individual functions for various different actions

Let’s go through each of these modules one by one.

Model

USER_REGISTER = 'User Registration'
USER_CONFIRM = 'User Confirmation'
USER_CHANGE_EMAIL = "User email"
INVITE_PAPERS = 'Invitation For Papers'
NEXT_EVENT = 'Next Event'
NEW_SESSION = 'New Session Proposal'
PASSWORD_RESET = 'Reset Password'
PASSWORD_CHANGE = 'Change Password'
EVENT_ROLE = 'Event Role Invitation'
SESSION_ACCEPT_REJECT = 'Session Accept or Reject'
SESSION_SCHEDULE = 'Session Schedule Change'
EVENT_PUBLISH = 'Event Published'
AFTER_EVENT = 'After Event'
USER_REGISTER_WITH_PASSWORD = 'User Registration during Payment'
TICKET_PURCHASED = 'Ticket(s) Purchased'


In the Model file, named mail.py, we first declare the various actions for which we send emails out. These actions are used globally as the keys in the other modules of the email sending service. Here, we define global variables with the names of the actions as strings. These are all constants, which means that their values never change. For example, USER_REGISTER has the value 'User Registration', which essentially means that anything related to the USER_REGISTER key is executed when the User Registration action occurs. Or in other words, whenever a user registers into the system by signing up or creating a new user through the API, he/she receives the corresponding emails.
Apart from this, we have the model class which defines a table in the database. We use this model class to store the actions performed while sending emails. So we store the action, the time at which the email was sent, the recipient, the subject and the message. That way we have a record of all the emails that were sent out via our server.

class Mail(db.Model):
    __tablename__ = 'mails'
    id = db.Column(db.Integer, primary_key=True)
    recipient = db.Column(db.String)
    time = db.Column(db.DateTime(timezone=True))
    action = db.Column(db.String)
    subject = db.Column(db.String)
    message = db.Column(db.String)

    def __init__(self, recipient=None, time=None, action=None, subject=None,
                 message=None):
        self.recipient = recipient
        self.time = time
        if self.time is None:
            self.time = datetime.now(pytz.utc)
        self.action = action
        self.subject = subject
        self.message = message

    def __repr__(self):
        return '<Mail %r to %r>' % (self.id, self.recipient)

    def __str__(self):
        return unicode(self).encode('utf-8')

    def __unicode__(self):
        return 'Mail %r by %r' % (self.id, self.recipient,)


All the information is stored in a table named mails. It stores the recipient, the time at which the email is sent (timezone aware), the action which initiated the email, the subject of the email and the entire HTML body of the email. If a datetime value is passed, we use that; otherwise we use the current time for the time field.

HTML Templates

We store the HTML templates in the form of key-value pairs in a file called system_mails.py inside the helpers module of the API. Inside system_mails.py, we have a global dict variable named MAILS as shown below.

MAILS = {
    EVENT_PUBLISH: {
        'recipient': 'Organizer, Speaker',
        'subject': u'{event_name} is Live',
        'message': (
            u"Hi {email}<br/>" +
            u"Event, {event_name}, is up and running and ready for action. Go ahead and check it out." +
            u"<br/> Visit this link to view it: {link}"
        )
    },
    INVITE_PAPERS: {
        'recipient': 'Speaker',
        'subject': u'Invitation to Submit Papers for {event_name}',
        'message': (
            u"Hi {email}<br/>" +
            u"You are invited to submit papers for event: {event_name}" +
            u"<br/> Visit this link to fill up details: {link}"
        )
    },
    SESSION_ACCEPT_REJECT: {
        'recipient': 'Speaker',
        'subject': u'Session {session_name} has been {acceptance}',
        'message': (
            u"Hi {email},<br/>" +
            u"The session <strong>{session_name}</strong> has been <strong>{acceptance}</strong> by the organizer. " +
            u"<br/> Visit this link to view the session: {link}"
        )
    },
    SESSION_SCHEDULE: {
        'recipient': 'Organizer, Speaker',
        'subject': u'Schedule for Session {session_name} has been changed',
        'message': (
            u"Hi {email},<br/>" +
            u"The schedule for session <strong>{session_name}</strong> has been changed. " +
            u"<br/> Visit this link to view the session: {link}"
        )
    },


Inside the MAILS dict, we have key-value pairs, where the keys are the global variables from the model that define the action related to the email template. In the value, we again have 3 key-value pairs – recipient, subject and message. The recipient defines the group who should receive this email, the subject goes into the subject of the email, while message forms the body of the email. For subject and message we use unicode strings with named placeholders that are used later for formatting using Python's .format() function.

Email Functions

This is the most important part of the entire email sending system, since this is where the email sending functionality is implemented using the above two modules. We have all these functions inside a single file, mail.py, inside the helpers module of the API. First, we import two things in this file – the global dict variable MAILS defined in the template file above, and the various global action variables defined in the model. There is one main function, send_email(to, action, subject, html), which is used by all the other individual functions for sending the emails. It takes as parameters the address to which the email is to be sent, the subject string and the HTML body string, along with the action to store in the database.

First we ensure that the email address of the recipient is present and isn't an empty string. After that, we retrieve the email service as set in the admin settings. It can be either "smtp" or "sendgrid". The sender address is formatted differently depending on the email service: while sendgrid uses just the email, for example "medomag20@gmail.com", smtp uses a slightly different format such as Medozonuo Suohu<medomag20@gmail.com>. So we set that as well in the email_from variable.

def send_email(to, action, subject, html):
    """
    Sends email and records it in DB
    """
    if not string_empty(to):
        email_service = get_settings()['email_service']
        email_from_name = get_settings()['email_from_name']
        if email_service == 'smtp':
            email_from = email_from_name + '<' + get_settings()['email_from'] + '>'
        else:
            email_from = get_settings()['email_from']
        payload = {
            'to': to,
            'from': email_from,
            'subject': subject,
            'html': html
        }

        if not current_app.config['TESTING']:
            if email_service == 'smtp':
                smtp_encryption = get_settings()['smtp_encryption']
                if smtp_encryption == 'tls':
                    smtp_encryption = 'required'
                elif smtp_encryption == 'ssl':
                    smtp_encryption = 'ssl'
                elif smtp_encryption == 'tls_optional':
                    smtp_encryption = 'optional'
                else:
                    smtp_encryption = 'none'

                config = {
                    'host': get_settings()['smtp_host'],
                    'username': get_settings()['smtp_username'],
                    'password': get_settings()['smtp_password'],
                    'encryption': smtp_encryption,
                    'port': get_settings()['smtp_port'],
                }

                from tasks import send_mail_via_smtp_task
                send_mail_via_smtp_task.delay(config, payload)


After this we create the payload containing the email address of the recipient, the email address of the sender, the subject of the email and the HTML body of the email. For unit testing and any other testing we avoid sending emails, since that is really not required in the flow, so we check that the current app is not configured to run in a testing environment. After that we have two different implementations depending on the email service used.

SMTP

There are 3 kinds of encryption that can be used with the smtp server – tls, ssl and optional. We determine this based on the admin settings again. Also, from the admin settings we collect the host, username, password and port for the smtp server.

After this we start a celery task for sending the email. Since sending emails to a number of clients can be time consuming, we do it using the celery queueing service without disturbing the main workflow of the system.

@celery.task(name='send.email.post.smtp')
def send_mail_via_smtp_task(config, payload):
    mailer_config = {
        'transport': {
            'use': 'smtp',
            'host': config['host'],
            'username': config['username'],
            'password': config['password'],
            'tls': config['encryption'],
            'port': config['port']
        }
    }

    mailer = Mailer(mailer_config)
    mailer.start()
    message = Message(author=payload['from'], to=payload['to'])
    message.subject = payload['subject']
    message.plain = strip_tags(payload['html'])
    message.rich = payload['html']
    mailer.send(message)
    mailer.stop()

Inside the celery task, we use the Mailer and Message classes from the marrow module of python. We configure the Mailer according to the various settings received from the admin and then use the payload to send the email.

Sendgrid

For sending email using the sendgrid API, we need to set the Bearer key which is used for authenticating the email service. This key is also defined in the admin settings. After we have set the Bearer key as the authorization header, we again initiate the celery task corresponding to the sendgrid email sending service.

@celery.task(name='send.email.post')
def send_email_task(payload, headers):
    requests.post(
        "https://api.sendgrid.com/api/mail.send.json",
        data=payload,
        headers=headers
    )


To send the email, all we need to do is make a POST request to the API endpoint "https://api.sendgrid.com/api/mail.send.json" with the headers containing the Bearer key and the data containing the payload with all the information related to the recipient, sender, subject of the email and the body of the email.
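A simplified sketch of what the sendgrid branch inside send_email() might look like is given below; the settings key name sendgrid_key is an assumption made for illustration, not the project's confirmed key:

# Hypothetical sketch of the sendgrid branch that would follow the smtp branch
# inside send_email(); 'sendgrid_key' is an assumed settings key name.
else:
    key = get_settings()['sendgrid_key']
    headers = {
        'Authorization': 'Bearer ' + key
    }
    from tasks import send_email_task
    send_email_task.delay(payload, headers)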

Apart from these, this module implements all the individual functions that are called when the various actions occur. For example, let's look at the email sending function for when a new session is created.

def send_email_new_session(email, event_name, link):
    """email for new session"""
    send_email(
        to=email,
        action=NEW_SESSION,
        subject=MAILS[NEW_SESSION]['subject'].format(
            event_name=event_name
        ),
        html=MAILS[NEW_SESSION]['message'].format(
            email=email,
            event_name=event_name,
            link=link
        )
    )


This function is called inside the Sessions API, for every speaker of the session as well as for every organizer of the event to which the session is submitted. Inside this function, we use send_email(). But first we need to create the subject and message body of the email from the templates, replacing the placeholders with actual values using Python formatting. MAILS[NEW_SESSION]['subject'] returns a unicode string: u'New session proposal for {event_name}'. So we use the .format() function to replace {event_name} with the actual event_name received as a parameter. It is equivalent to doing something like:

u'New session proposal for {event_name}'.format(event_name='FOSSASIA')

which would give us a resulting string of the form:

u'New session proposal for FOSSASIA'

Similarly, we create the HTML message body using the templates and the parameters received. After this is done, we call send_email(), which then sends the final email.


Implementing Roles API on Open Event Frontend to Create Roles Using an External Modal

This blog article will illustrate how roles are created via an external modal on the admin permissions page in Open Event Frontend, using the roles API. Our discussion will primarily involve the admin/permissions/index route to illustrate the process. The primary endpoint of the Open Event API with which we are concerned for creating a role is

POST /v1/roles

First we need to create a model for the role, which will have the fields corresponding to the API, so we proceed with the ember CLI command:

ember g model role

Next we define the model according to the requirements. The model needs to extend the base model class, and has only two fields: one for the title and one for the actual name of the role.

import attr from 'ember-data/attr';
import ModelBase from 'open-event-frontend/models/base';

export default ModelBase.extend({
  name      : attr('string'),
  titleName : attr('string')
});

Next we need to modify the existing modal to incorporate the API and the creation of roles. It is very important to note here that using createRecord as the model would result in a major flaw: if createRecord were used and the user tried to create multiple roles, every request after the first POST would be a PATCH request and would keep modifying the same role. To avoid this, a new record needs to be created every time the user clicks on Add Role. We slightly modify the modal component call to pass the name and titleName to it.

{{modals/add-system-role-modal  isOpen=isAddSystemRoleModalOpen
                                isLoading=isLoading
                                name=name
                                titleName=titleName
                                addSystemRole=(action 'addSystemRole')}}

Upon entering the details of the roles and successful validation of the form, if the user clicks the Add Role button of the modal, the action addSystemRole will be triggered. We will write the entire logic for the same in the respective controller of the route.

addSystemRole() {
  this.set('isLoading', true);
  this.get('store').createRecord('role', {
    name      : this.get('name'),
    titleName : this.get('titleName')
  }).save()
    .then(() => {
      this.set('isLoading', false);
      this.notify.success(this.l10n.t('User permissions have been saved successfully.'));
      this.set('isAddSystemRoleModalOpen', false);
      this.setProperties({
        name          : null,
        roleTitleName : null
      });
    })
    .catch(() => {
      this.set('isLoading', false);
      this.notify.error(this.l10n.t('An unexpected error has occurred. User permissions not saved.'));
    });
},

At first the isLoading property is set to true. This adds the Semantic UI class loading to the form, so the form goes into the loading state. Next, a record of type role is created and its properties are set to the corresponding values entered by the user.

Then save() is called, which subsequently makes a POST request to the server. If the request is successful, the modal is closed by setting the isAddSystemRoleModalOpen property to false. Also, the fields of the modal are cleared for a better user experience in case multiple roles need to be added one after the other.

If there is an error during the processing of the request, the catch() block executes; the modal is not closed and the fields are not cleared.


Implementing Users API to Display the Users at Admin Route in Open Event Frontend

This article will illustrate how the users, along with their roles, user links etc., are displayed and updated on the /admin/users route using the users API in Open Event Frontend. The primary endpoint of the Open Event API with which we are concerned for fetching the users is

GET /v1/users

First, we need to create a model for the user, which will have the fields corresponding to the API, so we proceed with the ember CLI command:

ember g model user

Next, we need to define the model according to the requirements. The model needs to extend the base model class. As a user can have multiple notifications, orders, sessions etc., we have to use the ember-data relationship "hasMany". Hence, the model will have the following format.

import attr from 'ember-data/attr';
import ModelBase from 'open-event-frontend/models/base';
import { hasMany } from 'ember-data/relationships';

export default ModelBase.extend({
  email        : attr('string'),
  password     : attr('string'),
  isVerified   : attr('boolean', { readOnly: true }),
  isSuperAdmin : attr('boolean', { readOnly: true }),
  isAdmin      : attr('boolean', { readOnly: true }),
  firstName    : attr('string'),
  lastName     : attr('string')
});

The complete code for the model can be seen here.

Now, we need to load the data from the API using the above model, so we will send a GET request to the API to fetch the users. This can be easily achieved using this.

return this.get('store').query('user', {'page[size]': 10 });

The above line queries the users from the store, which is the cache of all the records that have been loaded by our application. If a route asks for a record, the store can return it immediately if it is in the cache. Since we want to display only 10 users per page, we also specify via page[size] how many users should be loaded at a time.

Now we need to filter the users based on whether they are active or have deleted their accounts. For this purpose, we need to pass a filter to the query, which tells the API what type of users to load, as sketched below.
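As an illustration only (the field name deleted-at is an assumption, not necessarily the exact attribute used in the project), such a filter could be passed in the same name/op/val format used elsewhere in these posts:

// Illustrative sketch: load only users whose accounts have not been deleted.
return this.get('store').query('user', {
  filter: [
    {
      name : 'deleted-at',   // assumed field name, for illustration
      op   : 'eq',
      val  : null
    }
  ],
  'page[size]': 10
});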

The next thing we need to do is display the data fetched from the API in an ember table. Ember table allows us to render very large data sets by only rendering the rows that are being displayed. For this, we define a controller class which lets the table know which columns are required and which API attributes they correspond to. We can also define the template for each column. The code for the controller class looks like this.

import Ember from 'ember';

const { Controller } = Ember;
export default Controller.extend({
  columns: [
    {
      propertyName     : 'first-name',
      title            : 'Name',
      disableSorting   : true,
      disableFiltering : true
    },
    {
      propertyName     : 'email',
      title            : 'Email',
      disableSorting   : true,
      disableFiltering : true
     },
     {
      propertyName     : 'last-accessed-at',
      title            : 'Last Accessed',
      template         : 'components/ui-table/cell/admin/users/cell-last-accessed-at',
      disableSorting   : true,
      disableFiltering : true
    }
   ]
});

In the above code, we can see a field called 'disableSorting' which is true if we don't want to sort the table based on that column. Since we want the last-accessed-at column to be customised, we have separately added a template for it which determines how the cell will look. The complete code for the other columns in the table can be found here.

Now to display the ember table we will write the following code.

{{events/events-table columns=columns data=model
    useNumericPagination=true
    showGlobalFilter=true
    showPageSize=true
}}

In the above piece of code, we call the same ember table as we used in the case of events to reduce code duplication. We pass in the columns and data, which remain unique to this table. We also ensure that the page shows how much data we're fetching at one go and allows filtering the table based on the columns.

The UI of the users page for the above code snippets look like this.

Fig 1: The UI of the users table under admin/users route

The entire code for implementing the users API can be seen here.

To conclude, this is how we efficiently fetched the users using the Open-Event-Orga users API, ensuring that there are no unnecessary API calls to fetch the data and no code duplication, by reusing the same ember table.


Implementing Sessions API for the event in Open Event Frontend

This article will illustrate how the sessions available for a particular event are displayed and updated on the events/{event_id}/sessions route using the sessions API in Open Event Frontend. The primary endpoint of the Open Event API with which we are concerned for fetching the sessions is

GET /v1/sessions/{session_id}

First, we need to create a model for the sessions, which will have the fields corresponding to the API, so we proceed with the ember CLI command:

ember g model session

Next, we need to define the model according to the requirements. The model needs to extend the base model class. As a session can have multiple speakers and always belongs to an event, we have to use the ember-data relationships "hasMany" and "belongsTo". Hence, the model will have the following format.

import attr from 'ember-data/attr';
import ModelBase from 'open-event-frontend/models/base';
import { belongsTo, hasMany } from 'ember-data/relationships';

export default ModelBase.extend({
  title    : attr('string'),
  subtitle : attr('string'),

  speakers : hasMany('speaker'),
  event    : belongsTo('event')
});

Complete code for the model can be seen here

Now, we need to load the data from the API using the above model, so we will send a GET request to the API to fetch the sessions corresponding to a particular event. This can be easily achieved using this.

return this.modelFor('events.view').query('sessions');

The above line gets the current model for the events.view route and queries the sessions relationship on that model.

Now we need to filter the sessions based on their state, whether they have been accepted, confirmed, pending or rejected, and display them on different pages. For this purpose, we need to pass filter and page options to the query, which tell the API what type of sessions, and how many, to load at once. Also, we need to include the speakers associated with the session and the event details. For this case, the above query will be formatted like this.

return this.modelFor('events.view').query('sessions', {
  include      : 'event,speakers',
  filter       : filterOptions,
  'page[size]' : 10
});

In the above query, filterOptions is built so as to check what type of sessions the user is querying for; a sketch follows below, and the actual code can be found here.
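As a rough sketch (not the project's exact code), filterOptions could be built from the requested session state like this; the route parameter name session_status is an assumption:

// Illustrative sketch: build the filter from the requested session state.
let filterOptions = [];
if (params.session_status && params.session_status !== 'all') {  // 'session_status' is an assumed param name
  filterOptions = [
    {
      name : 'state',
      op   : 'eq',
      val  : params.session_status
    }
  ];
}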

The next thing we need to do is display the data fetched from the API in an ember table. For this, we need a controller class which lets the table know which columns are required and which API attributes they correspond to. We can also define the template for each column. The code for the controller class looks like this.

import Ember from 'ember';

const { Controller } = Ember;
export default Controller.extend({
  columns: [
    {
      propertyName   : 'state',
      title          : 'State',
      disableSorting : true,
      template       : 'components/ui-table/cell/events/view/sessions/cell-session-state'
    },
    {
      propertyName : 'title',
      title        : 'Title'
    },
    {
      propertyName   : 'speakers',
      template       : 'components/ui-table/cell/cell-speakers',
      title          : 'Speakers',
      disableSorting : true
    }
  ]
});

In the above code, we can see a field called 'disableSorting' which is true if we don't want to sort the table based on that column. Since we want the state column to be customised, we have separately added a template for it which determines how the cell will look. The complete code for the other columns in the table, apart from state, title and speakers, can be found here.

Now to display the ember table we will write the following code.

{{events/events-table columns=columns data=model
    useNumericPagination=true
    showGlobalFilter=true
    showPageSize=true
}}

In the above piece of code, we call the same ember table as we used in the case of events to reduce code duplication. We pass in the columns and data, which remain unique to this table. We also ensure that the page shows how much data we're fetching at one go and allows filtering the table based on the columns.

The UI of the sessions page for the above code snippets look like this.

Fig 1: The UI of the session table under events/{event_id}/session route

The entire code for implementing the sessions API can be seen here.

To conclude, this is how we efficiently fetched the session details using the Open-Event-Orga sessions API, ensuring that there are no unnecessary API calls to fetch the data and no code duplication, by reusing the same ember table.


Create Event by Importing JSON files in Open Event Server

Apart from the usual way of creating an event in FOSSASIA's Orga Server project by using POST requests to the Events API, another way of creating events is importing a zip file which is an archive of multiple JSON files. This way you can create a large event like FOSSASIA, with lots of data related to sessions, speakers, microlocations and sponsors, just by uploading JSON files to the system. A sample JSON file can be found in the open-event project of FOSSASIA. The basic workflow of importing an event and how it works is as follows:

  • The first step is similar to uploading files to the server. We need to send a POST request with multipart form data containing the zipped archive of JSON files.
  • The POST request starts a celery task to import the data from the JSON files and store it in the database.
  • The celery task URL is returned as a response to the POST request. You can poll this celery task to get its status. If the status is FAILURE, we get the error text along with it; if the status is SUCCESS, we get the resulting event data.
  • In the celery task, each JSON file is read separately and the data is stored in the db with the proper relations.
  • Sending a GET request to the above mentioned celery task, after the task has been completed, returns the event id along with the event URL.

Let’s see how each of these points work in the background.

Uploading ZIP containing JSON Files

For uploading a zip archive, instead of sending JSON data in the POST request we send multipart form data. The multipart/form-data format allows an entire file to be sent as data in the POST request along with the relevant file information. You can learn about the various form content types here.

An example cURL request looks something like this:

curl -H "Authorization: JWT <access token>" -X POST -F 'file=@event1.zip' http://localhost:5000/v1/events/import/json

The above cURL request uploads the file event1.zip from your current directory with the key 'file' to the endpoint /v1/events/import/json. The user uploading the file needs to have a JWT authentication key, or in other words be logged in to the system, as that is necessary to create an event.

@import_routes.route('/events/import/<string:source_type>', methods=['POST'])
@jwt_required()
def import_event(source_type):
    if source_type == 'json':
        file_path = get_file_from_request(['zip'])
    else:
        file_path = None
        abort(404)
    from helpers.tasks import import_event_task
    task = import_event_task.delay(email=current_identity.email, file=file_path,
                                   source_type=source_type, creator_id=current_identity.id)
    # create import job
    create_import_job(task.id)

    # if testing
    if current_app.config.get('CELERY_ALWAYS_EAGER'):
        TASK_RESULTS[task.id] = {
            'result': task.get(),
            'state': task.state
        }
    return jsonify(
        task_url=url_for('tasks.celery_task', task_id=task.id)
    )


After the request is received we check if a file exists under the key 'file' of the form data. If it is there, we save the file and get the path to the saved file. Then we pass this path to the celery task and run the task with the .delay() function of celery. After the celery task is started, the corresponding data about the import job is stored in the database for future debugging and logging purposes. After this we return the task url for the celery task that we started.

Celery Task to Import Data

Just like exporting an event, importing is also a time consuming task and we don't want other application requests to be blocked because of it. Hence, we use a celery queue to execute this task. Whenever an import task is started, it is added to the celery queue, and when it comes to the front of the queue it is executed.

For importing, we have created a celery task, import.event, which calls the import_event_task_base() function; that function uses the import helper functions to read the data from the JSON files and save it in the DB. After the task is completed, we update the import job data in the table with the status as either SUCCESS or FAILURE, depending on the outcome of the celery task.

As a result of the celery task, the newly created event's id and the frontend link from where we can visit the event are returned. This, along with the status of the celery task, is returned as the response for a GET request on the celery task. If the celery task fails, the state is changed to FAILURE and the error which celery encountered is returned as the error message in the result key. We also print an error traceback in the celery worker.

@celery.task(base=RequestContextTask, name='import.event', bind=True, throws=(BaseError,))
def import_event_task(self, file, source_type, creator_id):
    """Import Event Task"""
    task_id = self.request.id.__str__()  # str(async result)
    try:
        result = import_event_task_base(self, file, source_type, creator_id)
        update_import_job(task_id, result['id'], 'SUCCESS')
        # return item
    except BaseError as e:
        print(traceback.format_exc())
        update_import_job(task_id, e.message, e.status if hasattr(e, 'status') else 'failure')
        result = {'__error': True, 'result': e.to_dict()}
    except Exception as e:
        print(traceback.format_exc())
        update_import_job(task_id, e.message, e.status if hasattr(e, 'status') else 'failure')
        result = {'__error': True, 'result': ServerError().to_dict()}
    # send email
    send_import_mail(task_id, result)
    # return result
    return result

 

Save Data from JSON

In the import helpers, we have the functions which perform the main task of reading the JSON files, creating sqlalchemy model objects from them and saving them in the database. There are a few global dictionaries which help maintain the order in which the files are to be imported and saved, and also the file to model mapping. The first JSON file to be imported is the event JSON file, since all the other tables to be imported are related to the event table. After that, the order in which the files are read is as follows:

  1. SocialLink
  2. CustomForms
  3. Microlocation
  4. Sponsor
  5. Speaker
  6. Track
  7. SessionType
  8. Session

This order helps maintain the foreign key constraints. For importing data from these files we use the function create_service_from_json(). It sorts the elements in the data list based on the key "id" and then loops over all the elements (dictionaries) contained in the list. In each iteration it deletes the unnecessary key-value pairs from the dictionary and sets the event_id for that element to the id of the newly created event instead of the old id present in the data. After all this is done, it creates a model object based on the filename-to-model mapping with the dict data, and saves that object into the database.

def create_service_from_json(task_handle, data, srv, event_id, service_ids=None):
    """
    Given :data as json, create the service on server
    :service_ids are the mapping of ids of already created services.
        Used for mapping old ids to new
    """
    if service_ids is None:
        service_ids = {}
    global CUR_ID
    # sort by id
    data.sort(key=lambda k: k['id'])
    ids = {}
    ct = 0
    total = len(data)
    # start creating
    for obj in data:
        # update status
        ct += 1
        update_state(task_handle, 'Importing %s (%d/%d)' % (srv[0], ct, total))
        # trim id field
        old_id, obj = _trim_id(obj)
        CUR_ID = old_id
        # delete not needed fields
        obj = _delete_fields(srv, obj)
        # related
        obj = _fix_related_fields(srv, obj, service_ids)
        obj['event_id'] = event_id
        # create object
        new_obj = srv[1](**obj)
        db.session.add(new_obj)
        db.session.commit()
        ids[old_id] = new_obj.id
        # add uploads to queue
        _upload_media_queue(srv, new_obj)

    return ids


After the data has been saved, the next thing to do is upload all the media files to the server. This we do using the _upload_media_queue() function. It takes the paths to upload the files to from the storage.py helper file for the APIs. Then it uploads the files using the various helper functions to the static data storage services like AWS S3, Google Storage, etc.

Other than this, the import helpers also contain the function to create an import job that keeps a record of all the imports, along with the task url and the user id of the user who started the import task. It also stores the status of the task. Then there is the get_file_from_request() function, which saves the file that is uploaded through the POST request and returns the path to that file.

Get Response about Event Imported

The POST request returns a task url of the form /v1/tasks/ebe07632-392b-4ae9-8501-87ac27258ce5. To get the final result, you need to keep polling this URL. To know more about polling, read my previous blog about exporting an event or visit this link. When the task is completed you get a "result" key along with the status. The state can be either SUCCESS or FAILURE. If it is FAILURE, you get the error message describing why the celery task failed. If it is SUCCESS, you get data related to the event that was created by the import. The data returned is the event id, event name and the event url, which you can use to visit the event from the frontend. This data is also sent to the user as an email and a notification.

An example response looks something like this:

{
    "result": {
        "event_name": "FOSSASIA 2016",
        "id": "24",
        "url": "https://eventyay.com/events/ab3de6"
    },
    "state": "SUCCESS"
}

The corresponding event name and url are also sent to the user who started the import task. From the frontend, one can use the result object to show the name of the imported event along with the event url. Since the id and identifier are both present in the returned result, one can also make use of them to send GET, PATCH and other API requests to the events/ endpoint and get the corresponding relationship urls from it to query the other APIs. Thus, the entire imported data becomes available to the frontend as well.

 


 

Implementing Admin Statistics Mail and Session API on Open Event Frontend

This blog article will illustrate how the admin-statistics-mail and admin-statistics-session APIs are implemented on the admin dashboard page in Open Event Frontend. Our discussion will primarily involve the admin/index route to illustrate the process. The primary endpoints of the Open Event API with which we are concerned for fetching the admin statistics for the dashboard are

GET /v1/admin/statistics/mails
GET /v1/admin/statistics/sessions

First we need to create the corresponding models according to the type of the response returned by the server, which in this case will be admin-statistics-mail and admin-statistics-session, so we proceed with the ember CLI commands:

ember g model admin-statistics-mail
ember g model admin-statistics-session

Next we define the models according to the requirements. The models need to extend the base model class, and all the fields will be of type number since all the data obtained via these models from the API are numerical statistics.

import attr from 'ember-data/attr';
import ModelBase from 'open-event-frontend/models/base';

export default ModelBase.extend({
 oneDay     : attr('number'),
 threeDays  : attr('number'),
 sevenDays  : attr('number'),
 thirtyDays : attr('number')
});

And the model for sessions will be the following. It too consists only of attributes of type number since it represents statistics.

import attr from 'ember-data/attr';
import ModelBase from 'open-event-frontend/models/base';

export default ModelBase.extend({
 confirmed : attr('number'),
 accepted  : attr('number'),
 submitted : attr('number'),
 draft     : attr('number'),
 rejected  : attr('number'),
 pending   : attr('number')
});

Now we need to load the data from the API using these models, so we will send GET requests to fetch the statistics. This can be easily achieved via a store query in the model hook of the admin/index route. However, this cannot be a normal GET request, because the endpoint URLs are /v1/admin/statistics/mails & /v1/admin/statistics/sessions, and there are no relationships between statistics and the various sub routes, which is what ember's default behaviour would expect.

Hence we need to override the generated default request URL using custom adapters and the buildURL method.

import ApplicationAdapter from './application';

export default ApplicationAdapter.extend({
 buildURL(modelName, id, snapshot, requestType, query) {
   let url = this._super(modelName, id, snapshot, requestType, query);
   url = url.replace('admin-statistics-session', 'admin/statistics/session');
   return url;
 }
});

The buildURL method replaces the default URL segment admin-statistics-session with admin/statistics/session; otherwise the default request would have been

GET v1/admin-statistics-session

The same must be done for the mail statistics too; a sketch is given below. These adapters ensure that the correct requests are sent to the server. Now all that remains is making the requests in the model hook and adjusting the template slightly for the new models.
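The adapter for the mail statistics follows the same pattern; here is a sketch mirroring the session adapter above:

import ApplicationAdapter from './application';

export default ApplicationAdapter.extend({
  buildURL(modelName, id, snapshot, requestType, query) {
    let url = this._super(modelName, id, snapshot, requestType, query);
    // Map the conventional model URL segment to the actual admin statistics endpoint.
    url = url.replace('admin-statistics-mail', 'admin/statistics/mail');
    return url;
  }
});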

model() {
  return RSVP.hash({
    mails: this.get('store').queryRecord('admin-statistics-mail', {
      filter: {
        name : 'id',
        op   : 'eq',
        val  : 1
      }
    }),
    sessions: this.get('store').queryRecord('admin-statistics-session', {
      filter: {
        name : 'id',
        op   : 'eq',
        val  : 1
      }
    })
  });
}


queryRecord is used instead of query because only a single record is expected to be returned by the API.
