Cloud Migration Lead Developer/Architect Resume
Rockville, MD
SUMMARY:
- Excellent communication skills; good team player; highly motivated self-learner with the ability to develop proficiency in, and adapt to, new technologies and methods in a short period of time.
- Eighteen years of progressive experience in the computer industry as a Lead Software Developer, covering several complete project life cycles including systems analysis, design, development, migration, testing, implementation, and production support, with regular interaction with business/requirements analysts, end users, and auditors.
- Big Data developer well experienced in handling terabytes of data to implement surveillance algorithms and pattern analysis in the Big Data ecosystem.
- Experienced in migrating database/legacy applications to the AWS cloud ecosystem using services such as VPC, EC2, S3, EMR, and RDS for compute, Big Data analytics, and storage.
- Expertise in data warehouse modeling, development, and ETL tools such as IBM DataStage 8.5 Grid/8.0/7.5.x, and in Massively Parallel Processing (MPP) data warehouse appliances such as Netezza and Greenplum.
- Designed and implemented enterprise-level data warehouse solutions using DataStage 8.5 and Netezza (Mustang/TwinFin).
- 12+ years of experience in the financial domain.
- Worked extensively on the Unix operating system; created several batch jobs using shell scripts and Confidential scripts, scheduled and executed via the Autosys job scheduler and Unix crontab.
- Worked on multiple projects; strong skills in application enhancements, performance tuning, and production support; highly experienced in converting data from legacy systems and flat files.
- Worked extensively in SQL (ANSI and Postgres); expert in troubleshooting PL/SQL code and experienced in advanced features such as collection objects, bulk collection, dynamic SQL, ref cursors, XML DB features, autonomous transactions, partitions, global temporary tables, pipelined and table functions, cursor expressions, and various types of indexes.
TECHNICAL SKILLS:
Big Data: Hadoop, MapReduce, TEZ, AWS-EMR 3.9/5.3, Hive 2.1.0, Spark 2.0.2, SPARK-SQL.
Languages: SQL (ANSI and Postgres), PL/SQL, C, C++, Pro*C, UNIX Shell Scripting, Java, Perl 5.0, Confidential 3
RDBMS: ORACLE 7.x/8i/9i/10g/11g, SQL Server 6.5/7.0, MS Access, MySQL4.1
ETL: Ascential DataStage 7.5, IBM DataStage 8.0/8.x, Netezza (Mustang/TwinFin)
Operating Systems: Sun Solaris, HP-UX, Windows XP/NT/2000, Linux
Tools & Utilities: Autosys, TOAD 7.x, SQL*Plus, SQL*Loader, Forms & Reports 6i/9i, Erwin 4.0, Oracle Designer 2000/6i/9i, Export, Import, Delphi, Test Director 7.6, PVCS, Rational ClearCase/ClearQuest, MS Office 2007, Visual Basic 6.0, ER Studio 7.0, Mercury Quality Center 9.x, Borland StarTeam, MS Visio, Subversion, Git.
Java & Web Technologies: JMS 1.0.2, JAXP1.1, JNDI 1.2.x, JDBC, HTML, XML, DTD & Schema and XSL.
PROFESSIONAL EXPERIENCE:
Confidential, Rockville, MD
Cloud Migration Lead Developer/Architect
Responsibilities:
- Cloud Migration Lead Developer/Architect for the migration of Confidential’s existing surveillance pattern application from on-premise Netezza/Greenplum/Oracle to the AWS cloud using services such as VPC, EMR, and S3 along with Big Data analytic tools.
- Create Hive external tables on top of S3 data, add partitions, and code to requirements using Hive (HQL), Java (UDFs), and shell scripting; see the HQL sketch after this list.
- Analyze pattern requirements, create user stories and sub-tasks, and assign story points and expected hours for the sub-tasks in JIRA.
- Migrate surveillance pattern jobs from Hive (MapReduce and TEZ) to Spark 2.0.2 Spark-SQL.
- Rewrite several slow-performing HQL queries and application components during post-migration so they run faster and more cost-efficiently.
- Analyze the jobs in detail to determine the right instance/cluster type (storage-, I/O-, compute-, or memory-optimized) for cost-efficient performance.
- Wrote custom scripts to manage and resize the EMR cluster on the fly.
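Illustrative HQL for the external-table setup described above; this is a hedged sketch, and the bucket path, table, and column names are hypothetical, not the actual surveillance schema:

    -- External table over trade data already landed in S3
    -- (bucket path and columns are illustrative only)
    CREATE EXTERNAL TABLE IF NOT EXISTS trade_events (
      trade_id BIGINT,
      symbol   STRING,
      price    DECIMAL(18,4),
      qty      BIGINT
    )
    PARTITIONED BY (trade_dt STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    STORED AS TEXTFILE
    LOCATION 's3://example-bucket/surveillance/trade_events/';

    -- Register a day's partition after the feed files arrive
    ALTER TABLE trade_events ADD IF NOT EXISTS
      PARTITION (trade_dt='2017-01-31');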
Work Environment: AWS EMR 3.9/5.3, Hadoop, Hive 2.1.0, MapReduce, TEZ, Spark 2.0.2 Spark-SQL, PostgreSQL (RDS), AWS S3, Netezza, Greenplum, Oracle 10g/11g, Windows NT, Unix shell scripts, Subversion, Git
Confidential
Senior ETL lead
Responsibilities:
- Senior ETL Lead for Confidential exception generation and alert generation processes using Oracle, Netezza, and Greenplum.
- Implement large, complex PL/SQL and SQL queries that run against Oracle, Greenplum, and Netezza databases to efficiently perform complex business logic, summarize data, and generate exceptions and alerts.
- Plan and implement large data migration activities across different data warehouse appliances; familiar with the native tools available at the appliance level to move terabytes of data daily (see the Greenplum sketch after this list).
- Application DBA activities such as optimizing queries before they go live; analyzing and fixing slow-running production queries with utilities like Explain, Trace, and Stored Outlines; analyzing and preparing schema statistics; preparing data by import/export from external databases into the Dev and QC databases; and monitoring parallel queries and setting the related parameters.
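One appliance-native data-movement pattern of the kind referenced above is a Greenplum readable external table served by gpfdist. A hedged sketch; the host, port, path, table, and column names are hypothetical:

    -- Readable external table streaming flat-file extracts via gpfdist
    -- (host/port/path and columns are illustrative only)
    CREATE EXTERNAL TABLE ext_alerts_stage (
      alert_id   BIGINT,
      account_id BIGINT,
      alert_dt   DATE,
      score      NUMERIC(9,2)
    )
    LOCATION ('gpfdist://etlhost:8081/alerts/*.dat')
    FORMAT 'TEXT' (DELIMITER '|' NULL '');

    -- The load runs in parallel across segments via a plain INSERT ... SELECT
    INSERT INTO alerts SELECT * FROM ext_alerts_stage;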
Work Environment: Netezza (Mustang/Twinfin), Greenplum, ORACLE10g/11g, PL/SQL, TOAD, Windows NT, Unix Shell Scripts, Subversion, MS Visio
Confidential
Senior/Lead ETL and Back-End Developer
Responsibilities:
- Senior/Lead ETL and Back-End Developer for the Confidential projects SONAR (Securities Observation News Analysis and Regulation) and ERD (Enterprise Reference Data).
- Designed and developed ETL processes to transfer high-volume data from various sources (Oracle/Netezza/Greenplum) into destinations using DataStage and custom scripts.
- Developed and implemented ETL applications to extract trade data from sources such as NASDAQ and load it into SONAR/ERD tables for computation and analysis.
- Led the migration and setup of DataStage jobs from 7.5 to the DataStage 8.5 Grid environment.
- Designed and developed batch jobs to load data into the Netezza database using the nzload utility, and implemented business rules using Postgres SQL; see the nzload sketch after this list.
- Developed application components for summarizing and aggregating trade data daily, weekly, and monthly for the BREAK generation process.
- Developed application components for applying or rolling back stock issue splits and dividends for the SONAR derived price and volume data.
- Developed batch jobs and set the job dependencies using the Autosys job scheduler.
- Fine-tuned the applications for better performance.
- Developed scripts for data migration from legacy applications to new applications.
- Rewrote many SONAR/ERD components for efficiency and new requirements.
- Designed and developed Oracle packages containing procedures, functions, and object types to validate data in the de-normalized staging table, bulk-update it with the repaired data, and bulk-load it into the SONAR/ERD tables, using Oracle features such as object types, collections of object types, ref cursors, bulk collect, bulk binding, returning bulk collections, FORALL loops, table functions, and MERGE.
- Optimized the SQL for performance using SQL Trace, TKPROF, EXPLAIN PLAN, Oracle hints, and appropriate indexes; utilized features such as partitioned tables and indexes and global temporary tables for better performance.
- Developed and implemented unit test scripts and framework.
- Responsible for production support and application maintenance for SONAR/ERD.
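A hedged sketch of the kind of nzload invocation used in these batch jobs; the host, credentials, table, and file names are hypothetical:

    # nzload a pipe-delimited feed file into a Netezza table
    # (all names below are illustrative, not the production ones)
    nzload -host nzhost -db SONAR -u etl_user -pw "$NZ_PASS" \
           -t DAILY_TRADES -df /data/feeds/daily_trades.dat \
           -delim '|' -maxErrors 10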
Work Environment: DataStage 7.5/8.0/8.x, Netezza (Mustang/TwinFin), Oracle 9i/10g/11g, PL/SQL, TOAD, Windows NT, Unix, Mercury Quality Center, SCM, MS Visio, Erwin, Confidential, Java, XML
Confidential, Rockville MD
Database Architect
Responsibilities:
- Confidential is the nation’s largest healthcare provider network company, with one million healthcare providers such as hospitals, nursing homes, and labs, and healthcare practitioners such as doctors, under contract. Confidential is primarily engaged in the re-pricing of health insurance claims for health insurance companies, processing approximately 70 million healthcare claims each year.
- Designed and developed Oracle packages containing procedures, functions, and object types to validate data in the denormalized staging table, bulk-update it with the repaired data, and bulk-load it into the normalized tables, using Oracle features such as object types, collections of object types, ref cursors, bulk collect, bulk binding, returning bulk collections, FORALL loops, and table functions.
- Designed and developed pipelined functions that return collections of custom Oracle objects, and wrote spooling scripts that use the TABLE function over the collections to spool data into text files; see the sketch after this list.
- Wrote complex, efficient queries with parallel hints to select data from the source table for bulk insert into the normalized destination tables.
- Optimized the SQL for performance using SQL Trace, TKPROF, EXPLAIN PLAN, Oracle hints, and appropriate indexes; utilized features such as partitioned tables and indexes and global temporary tables for better performance.
- Developed SQL*Loader scripts to load data from flat files into Oracle tables.
- Developed Unix shell scripts for batch processing and for data loading and extraction in parallel.
- Modified existing Oracle packages and scripts for new requirements and supported the DBAs with deployments to the Dev Test, QA Test, Compliance Test, and Production environments.
- Interacted with business analysts on use cases and business rules and was involved in developing the design document.
- Prepared Technical Specification and deployment documents
- Developed Unit Test Cases and performed unit test.
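A minimal sketch of the pipelined-function pattern described above; the types, table, and column names are hypothetical:

    -- Object and collection types for the rows the function pipes out
    -- (names/columns are illustrative only)
    CREATE OR REPLACE TYPE claim_row_t AS OBJECT (
      claim_id  NUMBER,
      claim_amt NUMBER(12,2)
    );
    /
    CREATE OR REPLACE TYPE claim_tab_t AS TABLE OF claim_row_t;
    /
    CREATE OR REPLACE FUNCTION repriced_claims (p_batch_id IN NUMBER)
      RETURN claim_tab_t PIPELINED
    IS
    BEGIN
      -- Rows are piped to the caller as they are produced
      FOR r IN (SELECT claim_id, claim_amt
                  FROM stg_claims
                 WHERE batch_id = p_batch_id)
      LOOP
        PIPE ROW (claim_row_t(r.claim_id, r.claim_amt));
      END LOOP;
      RETURN;
    END;
    /
    -- Spooling scripts then read it with the TABLE operator:
    -- SELECT * FROM TABLE(repriced_claims(101));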
Work Environment: ORACLE 9i/10g, PL/SQL, TOAD, Windows 7, Unix (Linux & Solaris), Borland StarTeam, MS Visio, Subversion
Confidential, Herndon VA
Database Architect
Responsibilities:
- Developed Unix shell and Perl scripts for batch processing and data loading using SQL*Loader, and SQL scripts for various DML operations.
- Designed, developed, tested and deployed numerous Oracle Packages, standalone PL/SQL stored procedures and functions, DML and Database triggers for efficient database access, update, audit, and complex data integrity constraint implementations.
- Involved in maintaining and supporting all the environments for users, developers and testers.
- Used collection objects, PL/SQL object and record types, bulk collect, dynamic SQL, DBMS jobs, DBMS UTL packages, analytical functions, and XML DB features such as XMLType, XMLElement, XMLAttributes, XMLConcat, and XMLSequence in PL/SQL programming.
- Designed, modeled, and implemented creation scripts for various Oracle 9i database objects such as tables, indexes, views, packages (specifications and bodies), stored procedures and functions, triggers, sequences, synonyms, and constraints.
- Created an ETL process using shell scripts to load data from the SOX database into the reporting database.
- Developed SQL*Loader scripts to load data from flat files into Oracle tables; see the control-file sketch after this list.
- Interacted with business analysts on use cases and business rules and was involved in developing the design document.
- Developed Unit and Integration Test Cases
- Led the development team of Prospan 4.0a and 5.0a
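A hedged sketch of a SQL*Loader control file of the kind used here; the file, table, and column names are hypothetical:

    -- Control file: append pipe-delimited flat-file records to a staging table
    -- (all names below are illustrative only)
    LOAD DATA
    INFILE '/data/in/sox_feed.dat'
    APPEND
    INTO TABLE stg_sox_feed
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    ( record_id,
      account_no,
      amount,
      feed_dt DATE "YYYY-MM-DD"
    )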
Work Environment: Oracle 9i/10g, PL/SQL (stored procedures, functions, packages, and triggers), TOAD, Windows NT, UNIX, ClearCase/ClearQuest, Unix shell scripting, Perl 5.0, Oracle XML DB features, and ER Studio 7.0
Confidential, Summit NJ
Database Architect
Responsibilities:
- Extensively involved in analysis, design, development, and testing of Contact Management, Order Processing, Risk Management, and the Report Server.
- Designed, developed, tested and deployed numerous Oracle Packages, standalone PL/SQL stored procedures and functions, DML and Database triggers for efficient database access, update, audit, and complex data integrity constraint implementations.
- Developed packages that construct ref cursors, and developed Forms and Reports based on the ref cursors; see the package sketch after this list.
- Used PL/SQL Object Types, Bulk Collect, Dynamic SQLs, DBMS JOBs, DBMS UTL, and Analytical Functions for PL/SQL Programming.
- Created and maintained DBLinks, Synonyms, Sequences, Materialized Views, Global Temporary tables.
- Optimized the SQL for performance using SQL Trace, TKPROF, and EXPLAIN PLAN.
- Interact with Business Analysts for Use cases and Business Rules and involved in the development of Design Document.
- Developed Unit and Integration Test Cases.
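A minimal sketch of the ref-cursor package pattern referenced above; the package, table, and column names are hypothetical:

    -- Package spec exposing a ref-cursor type and an open procedure
    -- (names/columns are illustrative only)
    CREATE OR REPLACE PACKAGE order_rpt_pkg AS
      TYPE order_rc IS REF CURSOR;
      PROCEDURE open_orders (p_status IN VARCHAR2, p_rc OUT order_rc);
    END order_rpt_pkg;
    /
    CREATE OR REPLACE PACKAGE BODY order_rpt_pkg AS
      PROCEDURE open_orders (p_status IN VARCHAR2, p_rc OUT order_rc) IS
      BEGIN
        -- Forms/Reports consume the opened cursor as their data source
        OPEN p_rc FOR
          SELECT order_id, customer_id, order_amt
            FROM orders
           WHERE status = p_status;
      END open_orders;
    END order_rpt_pkg;
    /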
Work Environment: ORACLE 8i/10g, PL/SQL, TOAD, Oracle Forms 6i/Reports 10g, Oracle 10g Streams, Oracle Warehouse builder, Windows NT, UNIX, Test Director 7.6, PVCS, and Erwin.
Confidential, Troy, MI
Database Architect
Responsibilities:
- Interacted with the Database Architect and Business Analyst regarding design and business requirements for the combined order management and verification process, P.O., invoice, inventory management, shipping, etc.
- Designed, modeled, and implemented creation scripts for the various Oracle 9i database objects (Tablespaces, tables, indexes, views, packages, stored procedures & function, and triggers) using SQL, and the PL/SQL programming language on Sun Solaris.
- Designed, developed, tested and deployed numerous Oracle 9i Packages, standalone PL/SQL stored procedures and functions, DML and Database triggers for efficient database access, update, audit, and complex data integrity constraint implementations.
- Developed numerous programs using PL/SQL and many packages to do the validations as per requirements.
- Optimized the SQL for performance using SQL Trace, TKPROF, and EXPLAIN PLAN.
- Developed control files for SQL*Loader and loaded data into database tables from flat files.
- Performed Oracle database backups & recovery using export and import.
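A hedged sketch of the legacy export/import utilities used for these backups; the credentials, schema, and file names are hypothetical:

    # Export one schema to a dump file, then restore it with import
    # (credentials, schema, and file names are illustrative only)
    exp system/"$ORA_PASS" FILE=orders_bkp.dmp OWNER=orders LOG=exp_orders.log
    imp system/"$ORA_PASS" FILE=orders_bkp.dmp FROMUSER=orders TOUSER=orders LOG=imp_orders.log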
Work Environment: Oracle 9.2.0.4, PL/SQL, SQL*Plus, Sun Solaris 7.0, Windows NT/XP