Sr. Data Engineering Analyst - ETL Developer/Data Analyst/Cloud Engineer Resume

SUMMARY

  • IT professional with 9+ years of experience as an ETL, ELT, Cloud & Data Engineer, with extensive experience in ETL, Azure cloud computing, Snowflake, and Agile methodologies.
  • Proficient in designing and developing strategies for Extraction, Transformation, and Loading (ETL) processes.
  • Experience in ETL processes, ETL design, converting logical to physical data models, data analysis, requirements gathering, source-to-target mapping, and Snowflake cloud data warehouse implementation on AWS.
  • Exposed to all aspects of the software development life cycle (SDLC), including analysis, planning, development, testing, implementation, and post-production analysis of projects.
  • Expertise in troubleshooting DataStage jobs and addressing production issues such as performance tuning, job redesign, SQL tuning, and data issues.
  • Experience in integrating various data sources (SQL Server, Oracle, Teradata, DB2, Hive, HDFS, AWS S3, Azure Storage, XML, JSON) into the data staging area.
  • Built shell scripts to automate day-to-day tasks such as source file validation, data processing, and email notifications, reducing manual effort.
  • Hands-on experience in troubleshooting production issues and performance tuning of ETL jobs.
  • Performed impact assessments covering schedule changes, dependency impacts, and code changes for change requests against existing Data Warehouse applications running in the production environment.
  • Involved in the creation and execution of unit test cases and in documenting unit test results.
  • In-depth experience in ETL design specification, DataStage design and development, quality processes, testing strategy, unit testing, and code deployment.
  • Implemented Performance Tuning Techniques while designing ETL applications.
  • Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently, with strong communication skills.
  • Experience in developing ODI mappings/reusable mappings using various transformations (Filter, Expression, Aggregate, Lookup, Set, Pivot) for extraction, transformation, and loading of data from multiple sources to the Data Warehouse.
  • Strong understanding of the principles of Data Warehousing using fact tables, dimension tables and Star/Snowflake schema modeling.
  • Used Snowpipe for continuous data loading; created internal and external stages and transformed data during load (see the Snowpipe sketch after this list).
  • Configured Time Travel and data retention settings for crucial tables, and set up resource monitors on warehouses.
  • Familiarity with Data Acquisition, Data Audit, Data Archival, Error & Rejection Handling, Data Profiling.
  • Experienced with branching, merging, and tagging in version control tools such as Git.
  • Extensively worked on Jenkins and OpenShift for continuous integration and end-to-end deployment automation.
  • Expertise in integrating Jenkins with tools such as OpenShift (build) and Git (repository) to implement CI/CD automation through Jenkins pipelines.
  • Experience with container-based deployments using Docker, working with Docker images, Docker Hub.
  • Built and maintained Docker container clusters managed by Kubernetes on Azure, using Linux, Bash, Git, and Docker.
  • Good understanding of the principles and best practices of Software Configuration Management (SCM) in Agile, SCRUM, Waterfall methodologies.
  • Hands-on experience using CodeHub, ServiceNow, HP ALM, Airflow, Jenkins, Git, OpenShift.
  • Strong focus on time-management, decision-making, and project delivery.
  • Good interpersonal skills and a team-working attitude; takes initiative and is proactive in solving problems and providing the best solutions.
  • Ability to learn new skills and adapt quickly to a changing environment.
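
A minimal sketch of the Snowpipe continuous-load pattern referenced above, assuming a hypothetical S3 landing bucket, stage, pipe, and target table (standard Snowflake DDL):

    -- Hypothetical external stage over an S3 landing bucket
    CREATE OR REPLACE STAGE raw_stage
      URL = 's3://example-bucket/landing/'
      FILE_FORMAT = (TYPE = JSON);

    -- Snowpipe that continuously loads new files, transforming during the COPY
    CREATE OR REPLACE PIPE raw_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_events (event_id, payload)
      FROM (SELECT $1:id, $1 FROM @raw_stage);

An internal stage works the same way, minus the URL; the SELECT inside the COPY is what allows transformation during load.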

TECHNICAL SKILLS

Data Integration (ETL/ELT) Tools: IBM InfoSphere Information Server 8.5, 8.7, 9.1.2, 11.1, 11.3, Informatica 9.1

Data Modeling: Erwin, Power BI, SAS, MS Visio

Databases: Teradata, DB2, MS SQL Server 2000, Oracle, Snowflake and Netezza

Reporting Tools: Power BI, Tableau, Cognos

Languages: SQL, PL/SQL, Shell Scripting, Python, C++

Operating Systems: Windows, Unix, Ubuntu, Linux

Scheduling Tools: Tivoli, Autosys, Airflow

Domain: Healthcare, Retail, Banking

Cloud/Container: Docker, VMware vSphere, OpenShift, Azure

Other: TOAD, SVN, Git, GitHub, Jenkins, TFS, JSON, ServiceNow, HPSM, ITSM, OpenShift, HP ALM, CodeHub

PROFESSIONAL EXPERIENCE

Confidential

Sr. Data Engineering Analyst - ETL Developer/Data Analyst/Cloud Engineer

Responsibilities:

  • Coordinate with Business Analysts (BA) to gather business requirements, evaluate the scope of design and technical feasibility.
  • Analyze complexity and technical impact of requirements to cope with the existing design and discuss with Business Analyst for further refinement of requirements.
  • Created high-level and detailed-level designs; conducted design reviews and design verification; created final work estimates.
  • Performed end-to-end implementation, maintenance, optimization, and enhancement of the application.
  • Performed code reviews and presented code and design to the Technical Review Board.
  • Designed and developed jobs that extract data from source databases using the Oracle, DB2, and Teradata database connectors.
  • Involved in creating SQL queries, performance tuning and creation of indexes.
  • Extensively used materialized views for designing fact tables.
  • Ensured that operational and analytical data warehouses can support all business requirements for business reporting.
  • Developed Unix shell scripts and worked on Perl scripts for the controlled execution of DataStage jobs.
  • Followed and used secure data transfer process to meet the PCI compliance.
  • Designed job streams in the TWS scheduler to execute jobs in sequence.
  • Participated in DataStage Design and Code reviews.
  • Worked in Agile/Scrum environment.
  • Led the team and ensured on-time delivery with the best quality and fewest defects, in keeping with Optum's quality standards.
  • Used the import and export functions in IBM InfoSphere Information Server Manager for importing and exporting jobs and projects.
  • Verify production code, support first three executions of code in production, Transition of the code and Process to the Maintenance/Support team.
  • Coordinated with various business partners, analytical teams, and stakeholders to provide status reporting.
  • Actively participated in team meetings: day-to-day calls, meeting reviews, status calls, batch reviews, etc.
  • Migrated all UGAP jobs from TWS to Airflow; was involved in the end-to-end migration process, which completed successfully.
  • Created job parameters, parameter sets and shared containers.
  • Consistently used Transformer, Modify, Copy, Join, Funnel, Aggregator, Lookup, and other processing stages to develop parallel jobs.
  • Generated Surrogate Keys for composite attributes while loading the data into the data warehouse.
  • Imported Metadata from the source database. Imported metadata definitions into the repository. Exported and imported Data Stage components.
  • Used Snowflake zero-copy cloning to clone databases for the DEV and QA environments (see the first sketch after this list).
  • Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns (sketched below).
  • Shared sample data with the customer for UAT by granting access (see the data sharing sketch below).
  • Configured Time Travel and data retention settings for crucial tables, and set up resource monitors on warehouses (covered in the first sketch below).
  • Set up data sharing from the Prod to the Stage and Dev environments (sketched below).
  • Fixed data issues that occurred during the ETL process.
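
A sketch of the zero-copy cloning, Time Travel, and resource-monitor setup described above; all database, table, and warehouse names are hypothetical:

    -- Zero-copy clone of production for a DEV environment
    CREATE OR REPLACE DATABASE dev_db CLONE prod_db;

    -- Extend Time Travel retention on a crucial table
    ALTER TABLE prod_db.dw.claims SET DATA_RETENTION_TIME_IN_DAYS = 30;

    -- Query the table as it looked one hour ago
    SELECT * FROM prod_db.dw.claims AT (OFFSET => -3600);

    -- Cap warehouse spend with a resource monitor
    CREATE OR REPLACE RESOURCE MONITOR etl_monitor
      WITH CREDIT_QUOTA = 100
      TRIGGERS ON 90 PERCENT DO SUSPEND;
    ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monitor;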
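
A minimal FLATTEN example, assuming a hypothetical raw_orders table whose payload column is a VARIANT holding an OBJECT with a line_items ARRAY:

    -- Lateral view over an ARRAY nested inside a VARIANT column
    SELECT o.order_id,
           li.value:sku::STRING AS sku,
           li.value:qty::NUMBER AS qty
    FROM   raw_orders o,
           LATERAL FLATTEN(INPUT => o.payload:line_items) li;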
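
A sketch of the data sharing used to grant UAT access; the share, schema, table, and consumer account names are hypothetical:

    -- Create a share and grant read access to selected objects
    CREATE OR REPLACE SHARE uat_share;
    GRANT USAGE ON DATABASE prod_db TO SHARE uat_share;
    GRANT USAGE ON SCHEMA prod_db.dw TO SHARE uat_share;
    GRANT SELECT ON TABLE prod_db.dw.claims TO SHARE uat_share;

    -- Make the share visible to the consuming account
    ALTER SHARE uat_share ADD ACCOUNTS = consumer_acct;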

Environment: DataStage 11.3, Unix/Linux, DB2 v10.1, TWS, ITG, SQL, BTEQ, Snowflake, Airflow, ADF, Azure DW, Jenkins, Ansible, Shell, Python, Docker, Artifactory, ECR, Git, OpenShift

Confidential

Sr. Software Engineer - ETL Developer/Data Analyst/Cloud Engineer

Responsibilities:

  • Coordinate with Business Analysts (BA) to gather business requirements, evaluate the scope of design and technical feasibility.
  • Analyze complexity and technical impact of requirements to cope with the existing design and discuss with Business Analyst for further refinement of requirements.
  • Created high-level and detailed-level designs; conducted design reviews and design verification; created final work estimates.
  • Performed end-to-end implementation, maintenance, optimization, and enhancement of the application.
  • Performed code reviews and presented code and design to the Technical Review Board.
  • Designed and developed jobs that extract data from source databases using the Oracle and Teradata database connectors.
  • Worked on branching, merging, and tagging in version control tools such as Git.
  • Built out Jenkins and OpenShift for continuous integration and end-to-end deployment automation.
  • Built pipelines integrating Jenkins with tools such as OpenShift (build) and Git (repository) to implement CI/CD automation.
  • Set up Kubernetes clusters from scratch and as a service, and hosted Kubernetes on cloud architecture with AKS. Worked on container-based deployments using Docker, Docker images, and Docker Hub.
  • Built and maintained Docker container clusters managed by Kubernetes on GCP, using Linux, Bash, Git, and Docker.

Environment: Teradata, DataStage 11.3, DB2 v10.1, TWS, HPSM, SQL, BTEQ, Airflow, Shell, Python, Docker, Artifactory, ECR, Git, Kubernetes, OpenShift, Unix

Confidential

Sr. Software Engineer - ETL Developer

Responsibilities:

  • Applied data transformations and data cleansing activities in the staging area.
  • Designed and developed mappings in DataStage to load the target database.
  • Worked in DataStage Designer to extract data from source databases, transform it, and load it into target databases.
  • Involved in writing unit test cases (UTC) and unit test results (UTR).
  • Used various stages such as Sequential File, Join, Lookup, Transformer, Data Set, Filter, Aggregator, Change Capture, Funnel, SAP ABAP, XML, and Column Generator when designing jobs in DataStage Designer.
  • Implemented SCD Type 2 techniques (see the sketch after this list).
  • Worked in DataStage Director to run and monitor jobs and view logs.
  • Designed and developed jobs that extract data from source databases using DB connectors.
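
The SCD Type 2 logic, expressed as a SQL sketch with hypothetical dim_customer and stg_customer tables (the actual jobs implemented this with DataStage stages; Snowflake-style MERGE syntax is used for consistency with the examples above):

    -- Step 1: expire the current version of rows whose tracked attributes changed
    MERGE INTO dim_customer d
    USING stg_customer s
      ON d.customer_id = s.customer_id AND d.is_current = 'Y'
    WHEN MATCHED AND (d.city <> s.city OR d.status <> s.status) THEN
      UPDATE SET effective_to = CURRENT_DATE, is_current = 'N';

    -- Step 2: insert a fresh current version for changed and brand-new keys
    INSERT INTO dim_customer
           (customer_id, city, status, effective_from, effective_to, is_current)
    SELECT s.customer_id, s.city, s.status, CURRENT_DATE, NULL, 'Y'
    FROM   stg_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id = s.customer_id AND d.is_current = 'Y'
    WHERE  d.customer_id IS NULL;

After step 1, a changed key no longer has a current row, so step 2 reinserts it alongside genuinely new keys; unchanged keys are left untouched.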

Environment: IBM DataStage 8.7, 9.1.1, Teradata 13.1.1, AIX 5.3, Linux 2.6.18, Oracle 10g, DB2, TWS, TOAD, SQL*Plus, HPSM.

Confidential

Software Engineer - ETL Developer

Responsibilities:

  • Actively involved in the ETL design phase to develop a process strategy, and created the corresponding Technical Design Documents and Functional Design Documents.
  • Responsible for peer reviewing and testing ETL components developed by other team members.
  • Worked on Oracle 9i databases and PL/SQL scripts for user-defined functions in DataStage.
  • Performed defect analysis and fixing during various testing phases.
  • Developed complex ETL jobs involving multiple data sources, including several RDBMS tables and flat files, in the development environment and exported them to the QA and Production environments.
  • Developed reusable DataStage jobs for extraction, transformation, and loading.
  • Used UNIX and Autosys tools for scheduling jobs.
  • Worked on UNIX shell scripting to write wrapper scripts and various other scripts.

Environment: IBM DataStage 8.5,8.7, AIX 5.3, Linux 2.6.18, Oracle, DB2, TWS, HP ALM.
