Candidate Data Transfers framework for Educational Testing Services Company

Business Problem/Scope of Work

A series of standardized assessments is used in Nebraska and Alaska public primary and secondary schools to assess a student's achievement and the knowledge learned at each grade level. The scope of work is to develop an online assessment and administration framework responsible for managing the transfer of candidate registration and test data, including responses, scores, and other related data, to other vendor systems. This module integrates closely with the Test Packager, Student Session, and Constraint Engine to retrieve test metadata, with the Management Systems to retrieve candidate data, and with the Test Delivery System to retrieve candidate responses and test session data.

Business Solution

Develop a Candidate Data Transfers framework that will:

  • Support multiple test programs within the same deployment and allow program specific customizations.
  • Structure the process into multiple phases and/or steps and allow each phase to be configured at the test-program level.
  • Design each phase to be independent and limit its dependencies on other phases, to minimize new builds and deployments.
  • Easily compose the approved phases within a test program, with the ability to schedule those processes for unattended automated execution (see the phase composition sketch after this list).
  • Provide CA&L internal web monitoring and control UIs for the automated processes.
  • Aggregate and transform data using APIs exposed by other components, limiting direct access to other component databases.
  • Process multiple candidates in parallel to expedite the transfer process.
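
A minimal sketch of the phase composition idea above: TransferPhase, PhaseContext, and CandidateTransferProcess are illustrative names rather than the actual framework classes, and the assumption is that each phase only sees a shared per-program context, never another phase.

```java
import java.util.List;
import java.util.Map;

/** Illustrative phase contract: each phase is self-contained and program-configurable. */
interface TransferPhase {
    String name();
    void execute(PhaseContext context) throws Exception;
}

/** Carries program-specific configuration so phases stay independent of one another. */
class PhaseContext {
    final String testProgram;                 // e.g. a state test program code
    final Map<String, String> programConfig;  // program-level overrides
    PhaseContext(String testProgram, Map<String, String> programConfig) {
        this.testProgram = testProgram;
        this.programConfig = programConfig;
    }
}

/** Composes the approved phases for one test program and runs them in order. */
class CandidateTransferProcess {
    private final List<TransferPhase> phases;
    CandidateTransferProcess(List<TransferPhase> phases) { this.phases = phases; }

    void run(PhaseContext context) throws Exception {
        for (TransferPhase phase : phases) {
            // Because each phase is independent, a program can enable, disable,
            // or reorder phases through configuration without rebuilding the others.
            phase.execute(context);
        }
    }
}
```

Keeping the phase contract this small is what allows a test program to add or reorder phases without triggering new builds of the existing ones.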

Technical Solution

  • Amazon EC2 instances allow Java applications to be deployed to AWS using existing application deployment tools and processes, or to integrate deployment with automated tools and services such as AWS CodeDeploy or AWS OpsWorks.
  • Amazon Simple Storage Service (Amazon S3), an object storage service offering industry-leading scalability, data availability, security, and performance, will be used to store the Candidate Admin record files, providing the highest durability.
  • Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables distribution of load across the CDT BAT instances (see the SQS sketch after this list).
  • AWS Batch dynamically provisions the optimal quantity and type of compute resources (e.g., CPU- or memory-optimized instances) based on the volume and specific resource requirements of the batch jobs submitted.
  • The implementation includes a Spring Batch job. For horizontal scaling and future scalability, it uses partitioning (via Partitioner and PartitionHandler) rather than remote chunking (see the partitioning sketch after this list).
  • In this context, a Spring task executor on a single node executes the partitioned steps in parallel.
  • While the output format can be XML, JSON is used to store the program-specific property and value pairs within the database in this multi-tenant system.
  • A Groovy utility class validates the entire XML against its XSD (see the validation sketch after this list).
  • The reader reads all records, partitions them into the specified number of partitions, and invokes independent processes to process and write them out to the CRDS interface.
  • Spring's AsyncTaskExecutor is used for this, running for now within a single Tomcat instance.
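
A minimal sketch of the local partitioning approach, assuming Spring Batch Java configuration and a hypothetical candidateWorkerStep bean that wraps the existing reader, processor, and writer; the grid size of 4 and the partition key name are placeholders. The taskExecutor call builds a TaskExecutorPartitionHandler behind the scenes, which matches the single-node AsyncTaskExecutor approach described above.

```java
import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
public class CandidateTransferPartitionConfig {

    /** Splits the candidate records into gridSize slices, one ExecutionContext per partition. */
    @Bean
    public Partitioner candidatePartitioner() {
        return gridSize -> {
            Map<String, ExecutionContext> partitions = new HashMap<>();
            for (int i = 0; i < gridSize; i++) {
                ExecutionContext context = new ExecutionContext();
                context.putInt("partitionIndex", i); // the worker step uses this to pick its slice
                partitions.put("partition" + i, context);
            }
            return partitions;
        };
    }

    /** Master step fans out to copies of the worker step via a local task executor. */
    @Bean
    public Step masterStep(StepBuilderFactory steps, Step candidateWorkerStep, Partitioner candidatePartitioner) {
        return steps.get("masterStep")
                .partitioner("candidateWorkerStep", candidatePartitioner)
                .step(candidateWorkerStep)                    // existing reader/processor/writer step (assumed)
                .gridSize(4)                                  // number of parallel partitions (placeholder)
                .taskExecutor(new SimpleAsyncTaskExecutor())  // single-node parallelism, no remote chunking
                .build();
    }
}
```

Local partitioning keeps all partitions in one JVM today, while leaving the door open to swap the partition handler for a remote one if horizontal scaling across nodes is needed later.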
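A minimal sketch of how SQS could distribute candidate batches across the CDT BAT instances, assuming the AWS SDK for Java v1; the class name, method names, and queue usage here are illustrative rather than the actual implementation.

```java
import java.util.List;

import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
import com.amazonaws.services.sqs.model.Message;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;

public class CandidateTransferQueue {

    private final AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();
    private final String queueUrl; // URL of the CDT work queue (assumed to exist)

    public CandidateTransferQueue(String queueUrl) {
        this.queueUrl = queueUrl;
    }

    /** Producer side: enqueue one message per candidate batch to be transferred. */
    public void enqueueBatch(String candidateBatchId) {
        sqs.sendMessage(queueUrl, candidateBatchId);
    }

    /** Consumer side: each CDT BAT instance long-polls the queue and processes what it receives. */
    public void pollAndProcess() {
        ReceiveMessageRequest request = new ReceiveMessageRequest(queueUrl)
                .withMaxNumberOfMessages(10)
                .withWaitTimeSeconds(20); // long polling reduces empty receives
        List<Message> messages = sqs.receiveMessage(request).getMessages();
        for (Message message : messages) {
            // process the candidate batch, then delete the message so no other instance repeats it
            processBatch(message.getBody());
            sqs.deleteMessage(queueUrl, message.getReceiptHandle());
        }
    }

    private void processBatch(String candidateBatchId) {
        // placeholder for the actual transfer work
    }
}
```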
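The project's XML-against-XSD validation lives in a Groovy utility class; for consistency with the other sketches, here is the same javax.xml.validation approach shown in Java, with illustrative class and method names.

```java
import java.io.File;

import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class CandidateXmlValidator {

    /** Validates a generated candidate XML file against its XSD; throws an exception on failure. */
    public static void validate(File xmlFile, File xsdFile) throws Exception {
        SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(new StreamSource(xsdFile));
        Validator validator = schema.newValidator();
        validator.validate(new StreamSource(xmlFile));
    }
}
```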

Technologies/Skills Used

  • Programming Languages: Java 1.8, Groovy
  • Frameworks: Spring (Core, Web Services, Batch), Apache Commons Chain, Hibernate (4.x and 5.x), Log4j, EhCache
  • Development Tools: Maven, Jenkins, SVN, JIRA, Confluence
  • Database: PostgreSQL
  • Cloud: AWS
  • Cloud Components: AWS S3, AWS SQS
  • OS: CentOS, Red Hat Enterprise Linux

Customer Success Outcomes

  • New customers onboarded: initially the platform supported only Texas STAAR; two more states, Nebraska and Alaska, have since been added.
  • Ease of maintenance of the application using AWS monitoring services.
  • Improved performance: the application was able to serve 190K concurrent users.