Backend Directory Structure
The SmrtCCOps Vitalsigns backend deployment resides on vitalsigns1.smrttouch.com in the vitalsigns user's home directory, /home/vitalsigns. In /home/vitalsigns there is a single parent input folder, /home/vitalsigns/input. This folder is the primary input folder for the original TelcoOps implementation and is still used for that purpose. The SmrtCCOps input folders reside under this one as follows:
/home/vitalsigns/callcenter - shared folder for common lookups
/home/vitalsigns/callcenter/elead - input directory for elead data
/home/vitalsigns/callcenter/twc - input directory for Charter (formerly Time Warner Cable) data
/home/vitalsigns/callcenter/pccw - input directory for PCCW data
/home/vitalsigns/callcenter/infocus - input directory for infocus data
/home/vitalsigns/callcenter/la - input directory for LinkActiv data
/home/vitalsigns/callcenter/teg - input directory for Teg data
For archival of the inputs there is a single top-level archive directory, /home/vitalsigns/archive. Its usage and directory structure mirror the input directory structure.
A similar set of work directories exists where the data extraction routines log the raw webhook data as well as intermediate files containing transformed webhook data (flattened JSON structures, XML converted to JSON). These directories are rooted at /home/vitalsigns/work, with one directory per client under that root. The client directories are named the same as in the input and archive directory structures.
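As a minimal sketch of the mirrored layout, the per-client work directories could be created with a loop like the one below; the client list comes from the input directories above, and treating it as a fixed set is an assumption.

  #!/bin/sh
  # Illustrative only: create one work directory per client,
  # mirroring the input/archive layout described above.
  for client in elead twc pccw infocus la teg; do
      mkdir -p "/home/vitalsigns/work/$client"
  done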
Other directories of note:
/home/vitalsigns/bin - location of shell scripts.
/home/vitalsigns/nbwebutil - location of Node.js scripts used for data acquisition.
/home/vitalsigns/recovery_tools - location of Node.js scripts used to reprocess regenerated webhook data for Elead.
/home/vitalsigns/backup - location of backups of the PostgreSQL database.
SmrtCCOps Processing
All SmrtCCOps clients follow the same data acquisition and processing strategy. At present, two shell scripts scheduled via cron drive this processing. Both run at 15-minute intervals, offset from each other by 2 minutes.
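As an illustration of that schedule, the crontab entries would look something like the following; the specific minute offsets and the log file locations are assumptions, not the actual crontab.

  # Hypothetical crontab entries for the vitalsigns user
  # (minute offsets and log paths are assumptions)
  0,15,30,45 * * * * /home/vitalsigns/bin/loadElead.sh >> /home/vitalsigns/logs/loadElead.log 2>&1
  2,17,32,47 * * * * /home/vitalsigns/bin/LoadProd.sh >> /home/vitalsigns/logs/loadProd.log 2>&1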
/home/vitalsigns/bin/loadElead.sh drives the processing for Elead. It extracts the webhook data from Redis, writing the final output to the elead input directory. It also extracts lead information and state from the tenant's MySQL List databases, writing those extracts to the input directory as well. It then calls the batch load process to load the data into Vitalsigns.
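The Redis extraction step can be pictured with a redis-cli sketch like the one below; the key name (elead:webhooks) and the assumption that the webhook payloads sit in a Redis list are both guesses about the queueing mechanism, and the actual extraction is presumably handled by the Node.js data acquisition scripts in /home/vitalsigns/nbwebutil rather than redis-cli.

  # Hypothetical sketch: read all queued Elead webhook payloads and
  # append them to a raw log in the work directory (key name assumed)
  redis-cli lrange elead:webhooks 0 -1 >> "/home/vitalsigns/work/elead/webhooks_raw_$(date +%Y%m%d%H%M).log"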
/home/vitalsigns/bin/LoadProd.sh loads the remaining clients. It follows a similar pattern, extracting webhook data for each client and writing it to the corresponding input directory. For twc, additional scripts are called to extract lead information from the MySQL lead database. It then calls the batch load process to load the data into Vitalsigns.
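The overall shape shared by both driver scripts can be sketched as follows; extract_webhooks.sh and batch_load.sh are placeholder names used only to illustrate the pattern, not the real commands.

  #!/bin/sh
  # Hypothetical outline of the per-client driver pattern described above.
  for client in twc pccw infocus la teg; do
      # 1. Pull the client's webhook data and write it to its input directory
      /home/vitalsigns/bin/extract_webhooks.sh "$client" "/home/vitalsigns/callcenter/$client"
  done
  # 2. Run the batch load process to bring the new input files into Vitalsigns
  /home/vitalsigns/bin/batch_load.sh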
In addition to the webhook data, SMRT projects, number groups, and numbers are extracted from the SMRT API five times a day and loaded into Vitalsigns as reference data. The extraction runs on vitalsigns2.smrttouch.com. The script used is /home/vitalsigns/smrtextract/getRefTablesPrd.sh. The extracted data is transferred to the /home/vitalsigns/input/callcenter directory on vitalsigns1.
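A minimal sketch of that schedule and transfer on vitalsigns2 is shown below; the run times, the use of scp, and the local output location are assumptions (only the script path and the destination directory come from the description above).

  # Hypothetical crontab entry on vitalsigns2 - five runs per day, times assumed
  0 1,6,11,16,21 * * * /home/vitalsigns/smrtextract/getRefTablesPrd.sh
  # Inside the script, the transfer step could be an scp along these lines
  # (local output path and file pattern are assumptions):
  #   scp /home/vitalsigns/smrtextract/out/* vitalsigns@vitalsigns1.smrttouch.com:/home/vitalsigns/input/callcenter/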