Data Engineer Resume
McLean, VA
SUMMARY
- A responsible and committed engineer with more than 6 years of experience designing, implementing, and adapting technically sophisticated applications, with experience in all phases of development, from requirement discovery and initial implementation through release, enhancement, and support, using SDLC and Agile techniques.
- Proficient in the AWS cloud, Oracle SQL, PL/SQL programming, ETL (Extract, Transform & Load) techniques for data processing and analysis, data modeling, and data visualization technologies.
- Experience in the analysis, design, development, implementation, and testing of Oracle applications in data warehousing and client/server environments.
- Extensive knowledge of AWS, particularly data migration to the AWS cloud.
- Experience in Oracle SQL and PL/SQL covering all database objects: stored procedures, stored functions, packages, TYPE objects, triggers, cursors, REF cursors, parameterized cursors, views, and materialized views.
- Experience with Snowflake (cloud-based data warehousing) and data migration from existing data lakes to Snowflake.
- Experienced in working with RDBMS, OLAP, and OLTP concepts.
- Solid understanding of data modeling (dimensional and relational).
- Research-oriented, proactive, self-starter with strong technical, analytical and interpersonal skills.
- Excellent problem-solving skills; a creative, adaptive, and result-oriented professional with an eagerness to learn. Proficient at working as a team player with the aim of contributing to the team's success.
TECHNICAL SKILLS
Database Technologies: Oracle Exadata, Oracle 10g, MySQL, SQL, PL/SQL
Operating Systems: UNIX (Linux, Solaris, RHEL, CentOS, Ubuntu), Windows
Amazon Web Services Technologies: S3, EC2, EMR, SNS, SQS, IAM, KMS, Redshift, CloudFormation, CloudWatch, Lambda, VPC, Route 53, RDS, DynamoDB, Kinesis, Snowflake
DevOps tools: Jenkins, GitHub, Nexus, JIRA, Zabbix
Languages: Python (basics), Spark SQL, HTML/XML, JavaScript, JSON
IDEs: PyCharm, Anaconda
Big Data: Apache Hadoop, Spark, HDFS, Pig, Hive, YARN, Sqoop, Grafana, Hue, Datacompy, Snowfox
Business Intelligence and Reporting Tools: Tableau, QlikView, AWS QuickSight, OBIEE
Other tools: PuTTY, WinSCP, Postman, Git Bash, ServiceNow
PROFESSIONAL EXPERIENCE
Confidential - McLean, VA
Data Engineer
Responsibilities:
- Involved in the full life cycle of the project, from design, analysis, and logical and physical architecture modeling through development, implementation, and testing.
- Supported the Daily LCR, IRR, NSFR and FTP use cases.
- Analyzed business requirements and designed, mapped, and loaded data from various sources, such as flat files and XML files, into the target database.
- Worked with business partners to gather requirements and produce modern data pipeline designs.
- Involved in planning the AWS architecture for moving the existing infrastructure and processes onto the cloud platform.
- Designed and developed data pipelines to move financial data, sourced from multiple LOBs, onto the AWS cloud platform (data lake on S3) for downstream business users to execute their Python models.
- Leveraged the serverless Lambda service and an event-based architecture to orchestrate data delivery with built-in technical and business data quality checks.
- Used Spark SQL and PySpark to perform data transformations; leveraged Scala to write automated test scripts for system testing.
- Developed Oracle objects (packages, procedures, functions, etc.) and integrated them with the OFSAA front end using the OFSAA user interface/application suite.
- Involved in troubleshooting, debugging, problem solving, and tuning to improve the performance of back-end application programs.
- Responsible for translating business requirements into technical specifications by creating Jira stories.
- Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL design specifications.
- Involved in Sprint/PI planning with business users for designing and planning a data ingestion strategy.
- Anchored production deployment activities adhering to the Change Management process.
- Involved in the data life cycle management design process.
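The event-driven delivery pattern used in this role (S3 landing, Lambda orchestration, built-in data quality checks) can be sketched minimally as below. The S3 event shape is the standard one AWS sends to Lambda; the bucket/key names, required columns, and the quality rule itself are hypothetical placeholders, not the actual project logic.

```python
# Minimal sketch of an event-driven delivery step: an S3 "object created"
# event triggers the handler, which runs a simple technical data-quality
# check before routing the file onward. Names and the check are illustrative.

REQUIRED_COLUMNS = {"account_id", "balance", "as_of_date"}  # hypothetical schema

def passes_quality_check(header_line: str) -> bool:
    """Technical DQ check: the file header must contain the required columns."""
    columns = {c.strip().lower() for c in header_line.split(",")}
    return REQUIRED_COLUMNS.issubset(columns)

def handler(event, context=None):
    """Lambda entry point for a standard S3 ObjectCreated event."""
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]
    # In a real pipeline the object would be fetched from S3 (e.g. via boto3)
    # and its header read; here the header is stubbed for illustration.
    header = "account_id,balance,as_of_date"
    status = "delivered" if passes_quality_check(header) else "quarantined"
    return {"bucket": bucket, "key": key, "status": status}
```

In practice the handler would publish the status to SNS/SQS or write a manifest for downstream consumers; the sketch only shows the routing decision.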
Confidential - Pittsburgh, PA
Oracle Application Developer
Responsibilities:
- Involved in the SDLC, gathering requirements from end users. Developed views to facilitate easy interface implementation and enforce security on critical customer information.
- Developed stored procedures and triggers to ensure consistent data entry into the database. Wrote PL/SQL stored procedures and functions for common utilities.
- Participated in system analysis and data modeling, which included creating tables, views, indexes, synonyms, triggers, functions, procedures, cursors and packages.
- Served as a resource for interfacing data to Oracle HRMS through JDBC and Oracle packages.
- Developed Java code for Oracle HRMS.
- Using Oracle Forms, developed customized bug-tracking systems and the new-hire process.
- Involved in performance tuning of targets, sources, mappings, and sessions.
- Assisting QA Teams for bug testing, User Acceptance Testing (UAT) & System Integration Testing.
Confidential, Irving, TX
Oracle Developer
Responsibilities:
- Created stored procedures, functions, packages, collections, triggers, object types to implement complex business functionality. Data optimization, scrubbing and manipulation of staged data using PL/SQL packages. Use of PL/SQL bulk collection and DML (insert/update/delete) methods.
- Worked extensively in database development like developing Triggers, Functions, reports and forms.
- Worked with Oracle Applications modules (HRMS) using Oracle PL/SQL, SQL*Plus, Forms, and Reports.
- Encapsulated error handling and used autonomous transactions for logging. Used UTL_FILE for exporting data and UTL_MAIL to generate e-mails. Extensively used PL/SQL for high-performance stored procedures.
- Assisted with testing existing code and making necessary enhancements for the Oracle 11g upgrade. Created database objects such as tables, indexes, views, triggers, sequences, and synonyms.
- Extensively worked on writing complex SQL queries (cursors, REF cursors, subqueries, correlated subqueries). Provided support for multiple business groups and managed the SDLC for multiple projects to satisfy business needs.
Confidential, San Diego, CA
Systems Analyst
Responsibilities:
- Developed user-defined functions based on requirements. Developed back-end interfaces using PL/SQL stored packages, procedures, functions, collections, and triggers. Created PL/SQL procedures and packages. Customized Reports and Forms to meet users' requirements.
- Performed extensive query optimization and performance tuning. Created integrity constraints and database triggers for data validations. Used PL/SQL tables and records in the payment generation process. Wrote test plans and was involved in unit testing.
- Created relevant staging tables to load the CSV files and identified the business validation rules. Created a SQL*Loader script generator application using UNIX shell scripting and PL/SQL.
- Experience in creating the tables and sequences for the experimental data load capture. Loaded the data into the tables using TOAD and SQL*Plus.
- Created metadata validation lookup tables and pre-populated them using the SQL*Loader generator application.
- Worked extensively in database development like developing procedures, packages, triggers, functions, Discoverer, XML Publisher, reports and forms.
- Designed efficient PL/SQL programs using PL/SQL collections, record types, and object types.
- Good knowledge of joins, including inner joins, left outer joins, hash joins, and semi joins.
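The SQL*Loader script generator described in this role was built in UNIX shell and PL/SQL; the core idea can be sketched in Python (listed above under languages): read a CSV header and emit a basic LOAD DATA control file for a staging table. The table and file names here are hypothetical.

```python
import csv
import io

# Sketch of a SQL*Loader control-file generator: take the header row of a
# CSV and produce a minimal LOAD DATA control file for a staging table.
# Table/file names are illustrative; the original tool used shell + PL/SQL.

def generate_ctl(csv_text: str, table: str, data_file: str) -> str:
    header = next(csv.reader(io.StringIO(csv_text)))
    columns = ",\n  ".join(col.strip().upper() for col in header)
    return (
        f"LOAD DATA\n"
        f"INFILE '{data_file}'\n"
        f"INTO TABLE {table}\n"
        "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'\n"
        "TRAILING NULLCOLS\n"
        f"(\n  {columns}\n)\n"
    )

# Example: generate a control file for a hypothetical staging table.
ctl = generate_ctl("emp_id,emp_name,hire_date\n", "STG_EMPLOYEES", "employees.csv")
print(ctl)
```

A real generator would also map data types and handle date formats per column; this sketch covers only the column-list scaffolding.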
Confidential
Programmer
Responsibilities:
- Analyzed the business requirements for the enhancements needed in the contract Administration application and related policy modules.
- Documented the detailed requirements and prepared DLD for the new enhancements.
- Performed several DDL, DML and TCL operations.
- Created huge database packages with related functions and procedures.
- Extensive PL/SQL programming, including development of new back-end packages, procedures, and functions to incorporate advanced modules.
- Worked on BULK COLLECT for bulk loading the data into various transaction tables.
- Used UTL_FILE to load data into Oracle tables from flat, CSV, and text files.
- Added database triggers to some history tables of the database.
- Refreshed the development and test databases using export/import utilities.
- Provided 24x7 on-call support for the production environment.
- Implemented various automated UNIX shell scripts to invoke PL/SQL anonymous blocks and stored procedures/functions/packages using SQL*Plus sessions in silent mode.
- Worked on inline and correlated subqueries based on the business context.
- Implemented various customized Oracle reports using different techniques in Oracle SQL and PL/SQL.
- Performed extensive, in-depth analysis of problem tickets to identify the root cause of problems.
- Involved in impact analysis of defects on the associated functionalities and modules in the production environment.