
Computer Systems Analyst Resume


Richardson, TX

SUMMARY:

  • 17+ years of progressive Data Engineering experience as a Senior Software Engineer, Data Architect, Data Modeler, Systems DBA, and Data Engineer/Developer (server).
  • Conceptualized and proposed the architectural and infrastructure plans and deployment phases of complex operating environments (HP UNIX, RHEL Linux) and database management systems using Oracle, Postgres, and NoSQL MongoDB.
  • Gathered and analyzed business requirements, developed use cases, created logical, physical, and process diagrams, and translated complex requirements into long-term solutions on several projects.
  • Analyzed, evaluated, planned, installed, and implemented several new solutions after coordinating and presenting them to customers, Systems Engineers, Software Team Leads, System/Network Administrators, and Project Managers.
  • Authored design documents, created ER/logical/physical/process/data flow diagrams. Transformed UML data structures into database physical structures.
  • Guided over six professional Systems DBAs at AT&T in support of the Postgres data stores in a distributed environment of several data centers across the country.
  • Implemented best-practice procedures by updating the configurations and tuning the open-source products Cassandra and Postgres.
  • Proposed and influenced the capacity and storage plans of data stores for database servers, log files, and backup files on SAN, NAS, FTP, local filesystems, and Amazon S3 mounts (cloud storage).
  • Enhanced software development processes, by developing dynamic re-usable procedures and automating unit and integration test procedures.
  • Supported and managed enterprise level database servers; providing High Availability, Backup Recovery (online, offline), Disaster Recovery, Security (at DB object, user & OS levels), and performance tuning (of database servers and SQL queries).
  • Automated several DBA procedures, including backup recovery, database cloning/syncing operations, Oracle and Postgres database installation and configuration, and data migration between Postgres and Oracle.
  • Designed and developed several tools to extract, transform, and load data updates from tabular and spatial/geographic data formats. These tools have been used to clean, migrate/transform, and integrate/merge data sets. Also developed several data aggregation tools to help prepare data for daily, weekly, monthly, and ad-hoc analytics queries.
  • Authored documents of top level design, detail design, test plan/procedures, and user manuals.
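The data-aggregation tooling described above can be sketched roughly as follows. This is a minimal Python illustration of rolling raw records up into per-day summaries for analytics queries; the function name, record layout, and sample values are hypothetical, not taken from any actual project.

```python
from collections import defaultdict
from datetime import date

def aggregate_daily(records):
    """Roll raw (day, value) records up into per-day counts and totals --
    the kind of pre-aggregation that speeds up daily/weekly analytics queries.
    All names here are illustrative stand-ins."""
    totals = defaultdict(lambda: {"count": 0, "sum": 0.0})
    for day, value in records:
        bucket = totals[day]
        bucket["count"] += 1
        bucket["sum"] += value
    return dict(totals)

raw = [
    (date(2024, 1, 1), 10.0),
    (date(2024, 1, 1), 5.0),
    (date(2024, 1, 2), 7.5),
]
daily = aggregate_daily(raw)
```

In practice such a step would read from source tables and write the summaries back to reporting tables, but the grouping logic is the same.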

TECHNICAL SKILLS:

Data Modeler/Architect (13+ years): Logical and Physical data modeling, ER diagrams, Process models, UML diagrams, Business rules, Master Data Management (MDM), Data Governance, Data Assurance (integrity, availability, authenticity), Data Migration/Integration, Data Curation, Data Retention (records management, data archival, data purge).

Database Administrator (7+ yrs): Oracle 7, 10g, 11gR2; MongoDB 2 (2+ yrs); Postgres 9 (5+ yrs)

Big Data Systems (broad knowledge): Big Data Analysis/Science (SparkSQL, SparkML, Spark Python) including big data Cleaning/Curation and Integration; Modern Databases ( Cassandra, MongoDB, HP-Vertica, Hadoop ecosystem); Distributed Computing Platforms, Analytics with Fast Algorithms, Machine Learning tools.

Data Engineer/Developer (14+ yrs): Expert in Oracle PL/SQL, SQL, Pro*C, and UNIX/Linux shell scripting (Korn, bash); Python, psql, PL/pgSQL, C (3+ yrs); Java, Perl (2+ yrs, 2008); VAX COBOL-RDB (1991).

Operating Systems (14+ years): HP UNIX 10, 11i; Red Hat Enterprise Linux 6 (last 5+ yrs); Solaris UNIX (3+ yrs); Windows NT (5+ yrs); CLIX UX (6+ yrs); VAX-VMS (2+ yrs)

Virtual Machines: Linux virtual machines in multiple Data Centers, Oracle VirtualBox, VMware Workstation 9, 10 (Linux nodes for multiple uses).

Networking (configured): TCP/IP, FTP, DHCP, NFS mounts, iSCSI drives, DNS server (Linux)

High Availability (last 3+ yrs): Oracle 11g Grid, RAC, Active Data Guard (standby DB), Replication Streams, and Postgres (replication streaming with pgPool-II 3.2).

Disaster Recovery(2+ yrs): Oracle Data Guard, Flashback Recovery, RMAN, Postgres replication

Tools & Utilities used: ODBC, JDBC, ssh/rsh (secure/remote shell)

Oracle: Enterprise Manager OEM, Automatic Storage Management ASM, Backup & Recovery Manager RMAN, Flashback (Table, Query, Data Archive, Database), Tablespace Point in Time Recovery (TSPITR), Export(expdp)/Import(impdp), Oracle Cluster File System OCFS, Scheduler, and Diagnostic

Postgres: Enterprise Manager PEM 3, pgAdmin, psql, pg_dump, pg_restore, pg_ctl, etc.

Development Platform & Design Tools used: Rational Apex, Oracle Designer, Rational CASE tool, MyEclipse; evaluated ERwin, Tableau

PROFESSIONAL EXPERIENCE:

Confidential, Richardson, TX

Computer Systems Analyst

Responsibilities:

  • Analyzed, identified, and resolved performance issues in the APIGEE application for Business Intelligence queries and reports. The proposed solution led to optimization of the network, Linux filesystem, and memory management, and an upgrade of the hardware. Query results are now available within 2 minutes for up to 15 million records, compared with only 5 million records before.
  • Analyzed and coordinated performance issues of the tCOTS/OAuth application with the Development and DevOps teams.
  • Worked with the team of professionals that installs, upgrades, patches, configures, and manages the Cassandra and Postgres servers for the APIGEE and OAuth applications.
  • Implemented best practice procedures by updating the configurations and tuning the open source products: Cassandra and Postgres.
  • Analyzed Cassandra tables’ storage and structures, for their scalability and performance.
  • Evaluated the metadata stored in Cassandra tables for APIGEE (apigee.com) application that provides APIs on a laptop, desktop, tablets, cellphones, etc.
  • Analyzed the capacity of the network and physical storage for the Cassandra and Postgres servers, running in several Data Centers.
  • Configured, monitored, and enhanced the distributed High Availability DB servers of OLAP and OLTP applications running in several data centers.
  • Developed scripts to archive and manage data updates to Amazon S3 (cloud storage).
  • Designed and developed procedures and scripts to automate backup recovery (online, offline, standalone), data replication monitoring, and failover/switchover between master and standby servers.
  • Authored several documents of current and updated configuration with tuned parameters.
  • Analyzed and troubleshot several problems that occurred in production and fixed them, for example, table locks in XDB, long running queries, filesystem 100% full.
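The failover/switchover automation mentioned above typically encodes a small decision policy before any promotion is triggered. The sketch below is a minimal, hypothetical Python illustration of that policy; the function name, threshold, and return codes are assumptions, not the actual AT&T scripts.

```python
def should_failover(master_reachable, replication_lag_bytes,
                    lag_threshold=16 * 1024 * 1024):
    """Decide what the monitoring script should do next.
    Promote the standby only when the master is unreachable; a reachable
    but lagging master warrants an alert, not a failover.
    Policy and threshold are illustrative assumptions."""
    if not master_reachable:
        return "promote_standby"
    if replication_lag_bytes > lag_threshold:
        return "alert_lag"
    return "healthy"
```

A real script would feed this from `pg_stat_replication` (or equivalent) and call the promotion command only on the `promote_standby` result, so the decision logic stays testable in isolation.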

Confidential,

Data Modeler/Architect

Responsibilities:

  • Contributed to enhancing and deploying the large-scale project “Canadian Automated Air Traffic System (CAATS)” as Technical Lead of a Data Engineering team of 6 engineers.
  • Developed detailed design solutions after assisting Systems Engineers and coordinating all data requirements and enhancements for CAATS adaptation and resource-data availability at any ATS aerodrome facility in Canada.
  • Coached DE team members in the software development lifecycle (SDLC).
  • Maintained and supported Oracle and Postgres database servers running on HP UNIX 11i, Windows, and Red Hat Enterprise Linux 6.
  • Analyzed, Designed and developed the top level and detail design documents, coordinated them with the customers, engineering team leads, and also led the software development within Data Engineering.
  • Designed, developed and implemented Data/Process models under MDM concepts, providing a single point of reference to several ATS facilities across the country, where master data is managed at the federal level. This has streamlined data sharing among several departments (Operation, AIS, Facilities management, Finance, etc), removed duplicates, standardized mass data, eliminated incorrect data after incorporating business rules, etc.
  • Enhanced and maintained Data Models implementing Data Governance concepts on CAATS adaptation and resource data. It includes entity relationship ER diagrams, logical/physical models, Entity/attribute descriptions, business rules, process models etc. These models include following areas of information: Aeronautical Data, Weather data, Application resource data, Scheduling, Gateways, Networks, User Interfaces, Business rule base, Hardware Configuration data, Tracking resource data, Billing resource data, etc.
  • Coached DE team on SDLC; it includes processes of software development, unit/integration tests, version controls (of code units, data files, log files, etc), peer-review/review code updates, and code propagation into CAATS software integration.
  • Assisted Systems Engineers on data requirements of CAATS enhancements.
  • Created and updated schema (tables/views structure) creation scripts implementing physical models.
  • Designed and developed dynamic reusable PL/SQL stored procedures (software code) with SQL - UNIX/Linux Korn/bash shell scripts to implement the business logic/rules and business process models. It includes Unit test plan/procedure scripts to validate the implementation code against design/requirements.
  • Designed and authored the migration path documents of Oracle DB servers, Development Platforms, and Middle tier software products from HP UNIX 11 to RHEL 6 servers and presented them to Systems Engineers, Team leads & Project Managers.
  • Analyzed, designed and authored Top-level and detail design documents, to store the inactive UML-class data objects of real-time CAATS application, to off-line ER tables to Dimensional Data Model tables in Postgres database server and allow analytic queries and report generation capabilities on off-line data. The detailed design included the logical/physical data models, data retention design, sample report formats, etc.
  • Proposed and developed several ETL tools to migrate, merge, and integrate data into the CAATS data model from several sources, including Aeronautical Information System data (in XML, CSV, MS Excel, and database export dump formats) and RADAR data in text-file format of horizontal and vertical geographic data points.
  • Migrated database servers from Oracle 7 (HP UNIX) to Oracle 11g on RHEL 6, upgraded Data Models from Designer 2000 to Designer 9i, and migrated 25% of the Data Models (Entity Relationship diagrams) to UML2 (Class Models).
  • Engineered and developed the deployment scripts, to implement the Data Model updates, performance tuning updates, and database configuration updates into the Postgres database servers running 24/7 in production.
  • Analyzed and troubleshot issues across the development, test, and production environments, including the following:
  • Evaluated: MyEclipse for Development Platform, Toad Data Modeler and CA-Erwin for Data Modeling, Postgres with pgPool-II, Slony, Bucardo (data replication) for reporting Database, and several Business Intelligence Reporting Tools (Oracle Discoverer, BI publisher, jReport, Pentaho, Eclipse BIRT), Big Data database MongoDB for storing real time application logs to perform analytics and generate reports.
  • Installed, configured, and supported NoSQL MongoDB on RHEL 6, in development and test environments, to archive log files and perform analytics on them.
  • Transformed and loaded the UML structured JSON format data stream/logs, of a real-time application, into MongoDB and performed analytic queries.
  • Installed, configured and supported Oracle and Postgres database servers running on HP-UNIX 11, RHEL 5, 6 for Development, Test, and Production environments.
  • Tuned database servers, SQL queries, PL/SQL functions/procedures for their performance.
  • Coordinated with System/network administrators and clients resolving problems with Operating Systems, network, storage, and applications.
  • Performed online/offline (physical) backup/recovery using RMAN with user-scripts, and logical backup/recovery using Data Pump Export/Import (expdp/impdp, exp/imp), SQL-loader (sqlldr) utilities.
  • Tuned database servers - Managed disk spaces and resources including resumable space allocation, transportable tablespace/database, and segment shrink.
  • Configured Flash Recovery area for quick recovery. Created encrypted, multi-section, compressed backup sets, backups with Long-term retention, and incremental backups.
  • Duplicated databases using RMAN, and Enterprise Manager. Performed Tablespace Point in Time Recovery TSPITR to recover data from corrupted/deleted rows, incorrect DDL.
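The JSON log-loading and analytics step described above (transforming a real-time application's JSON log stream into MongoDB for queries) can be sketched as follows. To keep the sketch self-contained it uses an in-memory list in place of a MongoDB collection, and the field names and sample documents are invented for illustration.

```python
import json
from collections import Counter

def load_log_stream(lines):
    """Parse a stream of JSON-formatted log lines (one document per line),
    skipping malformed entries -- a stand-in for inserting into MongoDB."""
    docs = []
    for line in lines:
        try:
            docs.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # a real loader might route bad lines to a quarantine file
    return docs

def count_by_class(docs):
    """Analytic query: how many log documents per (hypothetical) class name."""
    return Counter(d.get("class", "unknown") for d in docs)

stream = [
    '{"class": "FlightPlan", "event": "update"}',
    '{"class": "Track", "event": "drop"}',
    'not json',
    '{"class": "FlightPlan", "event": "create"}',
]
docs = load_log_stream(stream)
stats = count_by_class(docs)
```

With a real MongoDB target, the `count_by_class` step would become an aggregation pipeline over the collection rather than an in-memory `Counter`.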

Confidential

Sr. Software Engineer

Responsibilities:

  • Contributed to the development and enhancement of the large-scale CAATS project as a member of a Data Engineering team of over 16 software engineers. This successful project is now called AutoTrac III.
  • Created and managed the Conceptual, Logical, and Physical models for an Enterprise level Transaction Processing application.
  • Introduced Data Warehouse concepts and created Logical and Physical models for a small project.
  • Managed Data Models (Logical/Process models, ER diagrams, Business Rules, and Physical designs (DDL)) using Rational Case tool and Oracle Designer.
  • Developed numerous PL/SQL stored procedures and SQL/UNIX scripts to implement Business Rules.
  • Created several reusable modules in UNIX, SQL, PLSQL, and Pro*C to implement Process Models.
  • Created database instances and backup recovery procedures. Managed database storage, tuned parameters and briefed new DBAs on the Oracle products used and storage requirements.
  • Developed PL/SQL modules to Extract, Transform and Merge the Aeronautical and Radar data into CAATS.
  • Designed and developed several reusable modules to improve processing time using dynamic PL/SQL, SQL, and UNIX. These modules improved the software development life cycle and, in some cases, reduced processing time by around 80% while providing better accuracy.
  • Authored unit test Plans and Procedures to ensure the software code unit correctly implements the logical design. Created unit test Drivers, executed test drivers and validated the output with the expected output. Generated the pass/fail reports. Performed these unit test activities for each software code developed.
  • Prepared Software Integration Test (SIT) and Factory Acceptance Test (FAT) procedure documents ensuring business requirements are met. It includes Requirement analysis, test outline and test procedures.
  • Presented the benefit of removing and storing the inactive CAATS data into the off-line database for 5 to 8 years while providing ad-hoc query functionality to the end-user. Proposed Data Warehouse (DWH) with BI tools to achieve these objectives.
  • Participated in software development process improvement activities while learning Six Sigma; also self-trained on CMMI Levels 1 and 2.
  • Worked at the customer's ( Confidential ) facility to help them deploy CAATS at the national level by providing hands-on training in the Software Development Life Cycle and enhancement of their unique requirements.
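The unit-test driver pattern described above (run each code unit against prepared cases, compare actual to expected output, and generate a pass/fail report) can be sketched in a few lines. This Python version only illustrates the pattern's structure; the driver name, the example unit under test, and its cases are hypothetical, not the original Pro*C/PL/SQL drivers.

```python
def run_test_driver(cases, func):
    """Run each (name, input, expected) case through the unit under test
    and produce a pass/fail report, mirroring the driver pattern above."""
    report = []
    for name, arg, expected in cases:
        actual = func(arg)
        report.append((name, "PASS" if actual == expected else "FAIL"))
    return report

# Hypothetical unit under test: normalize an aerodrome identifier.
def normalize_ident(s):
    return s.strip().upper()

cases = [
    ("strips whitespace", "  cyvr ", "CYVR"),
    ("already clean", "CYYZ", "CYYZ"),
]
report = run_test_driver(cases, normalize_ident)
```

Keeping the driver generic over `func` is what makes it reusable across every code unit, so the same pass/fail reporting applies to each deliverable.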

Confidential

DBA, System Admin, Developer (Systems Analyst)

Responsibilities:

  • Worked in the Surveying Services Division with a Data Management Group of 4; contributed to developing GIS applications and performed database administration of Oracle 7 and system/LAN administration of Windows NT and SCO UNIX servers with around 40 client workstations.
  • Installed and configured Oracle 7 on UNIX, created backup/restore scripts, and managed data storage. Set up TCP/IP networks and created NFS mounts between UNIX, NT, and DOS. Implemented DHCP on an NT workstation for TCP/IP addresses. Set up gateways to access the Novell network from TCP/IP LAN computers.
  • Developed several GIS applications using INTERGRAPH products (MGDM, MGE, DBAccess, MicroStation, MDL) and C/C++ languages:
  • Survey Data Database System (SDDS) to manage Geodetic Survey Points and Base Line Adjustment data.
  • Digital Mapping Database Interface System (DMDIS) managing the photogrammetric drawings in 3D with advanced capabilities of providing geographic data update history for each element.
  • Bulk Data Conversion of geographic data from several formats (ASCII text, MGE (tiled), and database tables) to Enterprise GIS (DMDIS) application format with help from a senior GIS consultant from the USA. All data was extracted, cleaned, topologically structured and loaded. It saved several months of work.
