Data Engineer Resume

PROFESSIONAL SUMMARY:

  • 9+ years of strong experience as a Software Professional, with hands-on experience in System Analysis, Design, Development, Implementation/Support and Testing using ETL and BI tools
  • Extensively experienced in Informatica PowerCenter, Unix shell scripting and SQL
  • Designed data warehouses with a clear understanding of multi-dimensional data modeling, Informatica PowerExchange, shell scripting and SQL (Oracle, Netezza, DB2 and Teradata databases)
  • Extensive experience in data warehouse development and maintenance projects using ETL technology; performed project management activities such as estimation, project planning, scheduling, deployment, tracking, resource management and coordination of onsite and offshore teams, building strong client relationships
  • Proficient in data warehousing concepts, data modeling, dimensional star schema and snowflake schema methodologies, implementing slowly changing dimensions and converting legacy systems into an enterprise environment
  • Skilled in troubleshooting and enhancing Informatica jobs and addressing production issues such as performance tuning
  • Good at analyzing COBOL copybooks and developing data maps using Informatica PowerExchange
  • Optimized Informatica jobs using parallelism and partitioning concepts in Informatica
  • Excellent hands-on experience in data extraction, loading and analysis using the Cloudera platform (HDFS, Hive, Sqoop)
  • Extracted data from Oracle into Hive using Sqoop (see the sketch after this list)
  • Extensively worked with partitioned (including dynamically partitioned) and bucketed tables in Hive, designed both managed and external tables and optimized Hive queries
  • Experience working with various file formats in Hive (Text, SequenceFile, Avro, Parquet, RCFile and ORC)
  • Ability to work in tight schedules and efficient in meeting deadlines.
  • Independent developer with excellent technical, analytical and communication skills
  • Worked in Waterfall and Agile methodologies
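
A minimal sketch of the Oracle-to-Hive ingestion pattern referenced above, invoking Sqoop from Python and showing an illustrative partitioned, bucketed Hive table; connection details, table, column and path names are placeholders rather than values from an actual project.

```python
import subprocess

# Illustrative HiveQL for a partitioned, bucketed external table stored as ORC.
PARTITIONED_DDL = """
CREATE EXTERNAL TABLE IF NOT EXISTS staging.customers_part (
  id BIGINT, name STRING, state STRING
)
PARTITIONED BY (load_date STRING)
CLUSTERED BY (id) INTO 16 BUCKETS
STORED AS ORC
LOCATION '/data/staging/customers_part'
"""

def sqoop_oracle_to_hive(jdbc_url, user, password_file, src_table, hive_db, hive_table):
    """Run a Sqoop import that lands an Oracle table in Hive."""
    subprocess.run([
        "sqoop", "import",
        "--connect", jdbc_url,             # e.g. jdbc:oracle:thin:@//dbhost:1521/ORCL
        "--username", user,
        "--password-file", password_file,  # HDFS file holding the database password
        "--table", src_table,
        "--hive-import",
        "--hive-database", hive_db,
        "--hive-table", hive_table,
        "--num-mappers", "4",
        "--split-by", "ID",                # column used to split work across mappers
    ], check=True)

if __name__ == "__main__":
    sqoop_oracle_to_hive("jdbc:oracle:thin:@//dbhost:1521/ORCL", "etl_user",
                         "/user/etl/.oracle_pwd", "CUSTOMERS", "staging", "customers")
```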

TECHNICAL SKILLS:

Languages: Java, Python and SQL

ETL Tools: Informatica PowerCenter 8.x, 9.x, 10.1, 10.2 and Informatica PowerExchange

Reporting Tools: Tableau

Databases: Oracle, Teradata, DB2, Netezza and Apache Hadoop

Version Control Tools: IBM Rational Team Concert and ServiceNow

Testing: Unit Testing and Performance testing.

Scheduler Tools: Control-M and Tivoli scheduler for Informatica jobs

PROFESSIONAL EXPERIENCE:

Confidential

Data engineer

Responsibilities:

  • Designed and Developed the applications using Informatica, Shell scripting, Oracle and Teradata.
  • Designed and developed ETL jobs using Informatica PowerCenter to extract data from Teradata, transform it based on business requirements and load it into the data warehouse.
  • Designed and developed Informatica jobs to read data from flat files and load it into the data warehouse.
  • Performed the unit testing, system testing and integration testing of the applications.
  • Worked extensively with complex mappings using different transformations such as Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected/Connected Lookups, Aggregators and Normalizer transformations.
  • Troubleshot Informatica jobs, enhanced existing jobs and addressed production issues such as performance tuning
  • Developed shell scripts for source file watching, parameter file creation and Informatica job completion notification emails with load statistics
  • Developed a Python script to call a URL (CMS website) and parse the JSON response (see the sketch after this list)
  • Involved in brainstorming, consultation and troubleshooting of several technical problems faced during the data migration and application development processes.
  • Responsible for documenting requirements, design and coding changes; maintained code versions for every change and followed the approval process to migrate changes from the UAT (User Acceptance Testing) environment to production
  • Involved in code migration activities to higher environments
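
A minimal sketch of the CMS-website call described above; the endpoint URL and response fields are illustrative placeholders rather than the actual service.

```python
import json
import urllib.request

def fetch_cms_records(url):
    """Call the CMS URL and return the parsed JSON payload."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    payload = fetch_cms_records("https://example.cms.gov/api/records")  # placeholder URL
    for record in payload.get("results", []):  # assumed response shape
        print(record.get("id"), record.get("status"))
```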

Environment: Informatica PowerCenter 9.x, 10.1, 10.2, Oracle, shell scripting, Python, IBM Rational Team Concert and Tivoli scheduler.

Confidential

Data Engineer

Responsibilities:

  • Implemented solutions for ingesting data from SQL Server and Oracle sources into the Hadoop data lake using big data technologies such as the IBIS framework, HDFS, Hive and Sqoop.
  • Extensively worked with partitioned and dynamically partitioned tables in Hive, designed both managed and external tables and worked on optimization of Hive queries.
  • Wrote Hive queries to build a consolidated view of the provider data and was involved in troubleshooting errors in Hive scripts
  • Exported the analyzed data to relational databases using Sqoop to generate business reports (see the sketch after this list).
  • Actively participated in Data Migration Projects working closely with business users and stakeholders.
  • Engaged in developing mapping documents used for the design and development of ETL mappings.
  • Involved in brainstorming, consultation and troubleshooting of several technical problems faced during the data migration and application development processes.
  • Created Informatica mappings to Extract, Transform and load (ETL) data from SQL Server and Oracle using different Informatica transformations such as Source Qualifier, Expression, Lookups, Joiner, Router, Update Strategy, Sequence generator, Filter, Aggregator, Union etc.
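
A minimal sketch of the "consolidate in Hive, export with Sqoop" flow described above; host names, credentials, tables and the SQL itself are illustrative placeholders, and the PyHive client is assumed to be available.

```python
import subprocess
from pyhive import hive  # assumed Hive client library

def build_provider_view():
    """Run an illustrative consolidating query into a Hive reporting table."""
    conn = hive.Connection(host="hive-host", port=10000, database="provider_db")
    cur = conn.cursor()
    cur.execute("""
        INSERT OVERWRITE TABLE provider_db.provider_consolidated
        SELECT p.provider_id, p.name, d.specialty, c.claim_count
        FROM providers p
        JOIN provider_details d ON p.provider_id = d.provider_id
        JOIN claim_counts c     ON p.provider_id = c.provider_id
    """)
    cur.close()

def export_to_oracle():
    """Push the consolidated table back to Oracle for business reporting."""
    subprocess.run([
        "sqoop", "export",
        "--connect", "jdbc:oracle:thin:@//dbhost:1521/ORCL",
        "--username", "report_user",
        "--password-file", "/user/etl/.oracle_pwd",
        "--table", "PROVIDER_CONSOLIDATED",
        "--export-dir", "/warehouse/provider_db.db/provider_consolidated",
        "--input-fields-terminated-by", "\\001",  # default Hive text delimiter
    ], check=True)

if __name__ == "__main__":
    build_provider_view()
    export_to_oracle()
```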

Environment: Hadoop 2.0, Hive 1.2, Sqoop 1.4.3, IBIS, Python 2.7, Oracle 12c, Informatica 10.1

Confidential

Senior BI Developer

Responsibilities:

  • Performed the feasibility study and impact analysis, and prepared the high-level design, low-level design and detailed technical design documents
  • Involved in collating and defining business requirements for the data warehouse and in preparing the technical design/build document based on those requirements, through the development, execution, testing, defect fix and CR implementation phases of the project
  • Performed data cleansing on the source data and loaded it into staging tables for each data conversion
  • Participated in project groups and meetings to fully understand new business processes and identify any new data/reporting requirements resulting from these projects.
  • Designed and developed ETL mappings using Informatica PowerCenter to extract data from multiple sources (flat files, Oracle, mainframe, CSV and delimited files), transform it based on business requirements and load it into the data warehouse.
  • Involved in Peer review of code and participated in the overall systems testing to support the implementation of the application into production environment.
  • Developed complex mappings using different transformations such as Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected/Connected Lookups, Aggregators, Stored Procedures, XML transformations, Normalizer transformations and Java transformations.
  • Developed a reusable Google API package that fetches latitude and longitude information for address record updates (see the sketch after this list)
  • Validated legacy source files before Job execution by using Informatica Power Exchange
  • Involved in code migration activities (code migration to production and testing environments) and pre-production and post-production go-live activities.
  • Performed impact analysis of change requests based on revised requirements and implemented the CR
  • Experienced in writing test cases, test scripts and test plans, executing test cases, and reporting and documenting test results.
  • Involved in tracking and implementation of defects and Change requests
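
A minimal sketch of the reusable geocoding helper described above, written here in Python for consistency with the other examples (the original package was built against the project's Java/shell environment); it assumes the Google Geocoding REST endpoint and an API key supplied by the caller.

```python
import json
import urllib.parse
import urllib.request

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

def geocode(address, api_key):
    """Return (latitude, longitude) for an address, or None if no match is found."""
    query = urllib.parse.urlencode({"address": address, "key": api_key})
    with urllib.request.urlopen(f"{GEOCODE_URL}?{query}", timeout=30) as resp:
        payload = json.loads(resp.read().decode("utf-8"))
    results = payload.get("results")
    if not results:
        return None
    location = results[0]["geometry"]["location"]
    return location["lat"], location["lng"]
```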

Environment: Informatica 8.6, 9.0.1, Oracle 11g, Informatica PowerExchange 9.1, Java and shell scripting.

Confidential

Senior BI Developer

Responsibilities:

  • Involved in preparing the technical design/build document based on business requirements and in the development, execution, testing and defect fix phases of the project
  • Prepared the design documents (high-level design and low-level designs)
  • Actively participated in project groups and meetings to fully understand new business processes and identify any new data/reporting requirements resulting from these projects.
  • Analyzed the COBOL programs and prepared the mapping documents with transformation rules and data flow rules.
  • Developed the Informatica jobs that replaced mainframe jobs loading data into the data warehouse.
  • Designed and developed ETL mappings using Informatica PowerCenter to extract data from mainframe file sources, transform it based on business requirements and load it into the data warehouse
  • Developed data maps in the PowerExchange Navigator using COBOL copybooks to load data from mainframe files
  • Worked extensively with complex mappings using different transformations such as Application Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected/Connected Lookups, Aggregators and Normalizer transformations
  • Developed shell scripts to generate the parameter files for daily incremental loads (see the sketch after this list).
  • Developed a shell script to run the Informatica jobs from the Control-M job scheduler
  • Experienced in writing test cases, test scripts and test plans, executing test cases, and reporting and documenting test results
  • Performed the unit testing and involved in code review activities
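
A minimal sketch of the daily incremental-load pattern described above: generate a PowerCenter parameter file, then start the workflow with pmcmd as a Control-M job step would. The original scripts were written in shell; this sketch uses Python for consistency with the other examples, and the folder, workflow, service and variable names are illustrative placeholders.

```python
import subprocess
from datetime import date, timedelta

def write_param_file(path, run_date):
    """Write a PowerCenter parameter file covering the incremental window."""
    with open(path, "w") as f:
        f.write("[FOLDER_DW.WF:wf_daily_incremental.ST:s_m_load_fact]\n")
        f.write(f"$$LOAD_DATE={run_date:%Y-%m-%d}\n")
        f.write(f"$$PREV_DATE={run_date - timedelta(days=1):%Y-%m-%d}\n")

def start_workflow(param_file):
    """Start the workflow via pmcmd and wait for it to complete."""
    subprocess.run([
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC", "-d", "DOMAIN_DEV",
        "-u", "etl_user", "-p", "etl_password",
        "-f", "FOLDER_DW",
        "-paramfile", param_file,
        "-wait", "wf_daily_incremental",
    ], check=True)

if __name__ == "__main__":
    pf = "/infa/params/wf_daily_incremental.param"
    write_param_file(pf, date.today())
    start_workflow(pf)
```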

Environment: Informatica 9.1, Informatica PowerExchange, DB2, COBOL, shell scripts, ServiceNow and Control-M

Confidential

Informatica Developer

Responsibilities:

  • Involved in preparing the technical design/build document based on business requirements and in the development, execution, testing and defect fix phases of the project.
  • Prepared the design documents (high-level design, LLDs, SAD and SID) and actively participated in project groups and meetings to fully understand new business processes and identify any new data/reporting requirements resulting from these projects.
  • Designed and developed ETL mappings using Informatica PowerCenter to extract data from mainframe file sources, transform it based on business requirements and load it into the data warehouse
  • Developed data maps to load binary mainframe data using the PowerExchange Navigator
  • Developed Informatica jobs using Web Service Consumer transformations
  • Experienced in writing test cases, test scripts and test plans, executing test cases, and reporting and documenting test results
  • Worked extensively with complex mappings using different transformations such as Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected/Connected Lookups, Aggregators, Normalizer transformations and Web Service Consumer transformations
  • Participated in the overall systems testing to support the implementation of the application into production environment
  • Involved in tracking and implementation of defects and provided daily and weekly status reports to the onsite and client business teams

Environment: Informatica 8.6, Informatica PowerExchange, DB2, COBOL and web services
