Sr. ETL Engineer Resume

SUMMARY

  • 7+ years of expertise in end-to-end implementation of Data Warehouse and Business Intelligence applications for clients in major industry sectors such as Health Care, Insurance, Banking, and Financial Services.
  • Well versed in defining, designing, integrating, and re-engineering enterprise data warehouses and data marts in environments such as Teradata, Oracle, and Hadoop, at terabyte scale and with varying levels of complexity.
  • Involved in the complete Software Development Life Cycle (SDLC), from business analysis through development, testing, deployment, and documentation.
  • Strong experience in dimensional modeling using star and snowflake schemas, identifying facts and dimensions, and physical and logical data modeling.
  • Very strong in writing complex SQL queries and performance tuning.
  • Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, MultiLoad, and TPT to export and load data to/from different source systems, including flat files.
  • Experienced in writing UNIX shell wrapper scripts.
  • Worked extensively with Informatica PowerCenter 9.5/9.1/8.6.1/8.1 client tools: Designer, Repository Manager, Workflow Manager/Monitor, and Server.
  • Good knowledge of Informatica IDQ: data quality profiling, cleansing, and standardization.
  • Expertise in PowerExchange and in creating data maps for mainframe COBOL files.
  • Excellent understanding of the Big Data / Hadoop ecosystem, including HDFS, MapReduce, and Hive.
  • Designed and created Hive external tables with appropriate partitions for query efficiency (a sketch follows this list).
  • Experience in writing Hive Query Language and analyzing data using HiveQL.
  • Experience in writing Pig scripts and analyzing data using Pig queries.
  • Expertise in moving data between HDFS and the local file system.
  • Expertise in analyzing business data, writing analytical queries, and providing reports on the analysis.
  • Expertise in working with relational databases such as Oracle, SQL Server, and Teradata.
  • Hands-on experience with query tools such as TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant, and Queryman.
  • Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, and hints.
  • Knowledge of Java and strong working knowledge of Agile methodology.
  • Hands-on experience using GitHub for code management.
  • Expertise in scheduling tools such as Control-M, Dollar Universe, and UC4.
  • Hands-on experience supporting production jobs and delivering quick fixes.
  • Experience in resolving ongoing maintenance issues and bug fixes, monitoring jobs, and performance tuning mappings and sessions.
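
A minimal sketch of the kind of partitioned Hive external table described above; the table, columns, and HDFS path are hypothetical examples, not actual project objects:

    -- External table partitioned by load date so queries can prune to one slice
    CREATE EXTERNAL TABLE IF NOT EXISTS claims_raw (
        claim_id      BIGINT,
        member_id     BIGINT,
        claim_amount  DECIMAL(18,2),
        claim_status  STRING
    )
    PARTITIONED BY (load_dt STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    STORED AS TEXTFILE
    LOCATION '/data/warehouse/claims_raw';

    -- Register a new partition, then query only that slice
    ALTER TABLE claims_raw ADD IF NOT EXISTS PARTITION (load_dt = '2016-01-31');
    SELECT claim_status, COUNT(*)
      FROM claims_raw
     WHERE load_dt = '2016-01-31'
     GROUP BY claim_status;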

TECHNICAL SKILLS

Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2005/2000, Teradata 13.

ETL Tools: Informatica PowerCenter 8.x/9.x, Informatica PowerExchange, B2B

Hadoop - Big Data: Hadoop ecosystem (HDFS, YARN, MapReduce, Hive, Pig)

Programming/Scripting Languages: Java, Unix Shell scripting

Tools: TOAD, Oracle SQL Developer, PL/SQL Developer, PuTTY.

Scheduling tools: Control-M, Dollar Universe, UC4

Operating Systems: UNIX, Linux, MS Windows NT/2000

PROFESSIONAL EXPERIENCE

Confidential

Sr. ETL Engineer

Responsibilities:

  • Understood the requirement specifications and discussed them with the Product Owner for further drill-down of requirements.
  • Wrote Teradata MERGE and BTEQ scripts to load and update data in the Mozart and Panther Teradata systems (see the sketch after this list).
  • Wrote Teradata FastLoad and MultiLoad scripts to load data from different sources into the Mozart and Panther Teradata systems.
  • Pushed files from the local file system into HDFS.
  • Loaded data from HDFS into Hive tables.
  • Developed Informatica mappings, created tasks, sessions and workflows.
  • Handled development/coding and maintenance/enhancements; performed manual unit testing and validation checks.
  • Designed documents based on the requirement specifications.
  • Participated in the development of system solutions to bring resolution to the project.
  • Analyzed recurring system issues, provided permanent fixes, and took care of system abends.
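
A minimal sketch of the BTEQ/MERGE-style load described above; the database, table, and column names are hypothetical placeholders (actual Mozart/Panther objects are not shown), and credentials would normally come from a secured logon file:

    .LOGON tdprod/etl_user,etl_password;

    /* Upsert staged rows into the target table */
    MERGE INTO mozart_db.customer AS tgt
    USING stage_db.customer_stg AS src
       ON (tgt.customer_id = src.customer_id)
    WHEN MATCHED THEN UPDATE SET
         customer_name = src.customer_name,
         update_ts     = CURRENT_TIMESTAMP
    WHEN NOT MATCHED THEN INSERT
         (customer_id, customer_name, insert_ts)
         VALUES (src.customer_id, src.customer_name, CURRENT_TIMESTAMP);

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;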

Environment: Informatica PowerCenter 8.6/9.5, Oracle SQL, Java, PL/SQL, Teradata 13.10, Hive, HDFS, UNIX, Control-M, B2B, PowerExchange.

Confidential

Data Integration (ETL) Engineer

Responsibilities:

  • Analyzed requirements and performed development for the next phases of the APAC CRM solution hosted in Digital Alchemy.
  • Supported existing daily incremental loads, including issue identification and fixes.
  • Served as the SME for the data attributes needed from DT by the APAC business unit and the DA team.
  • Supported and managed the ongoing needs for SMB CRM data hosting.

Environment: Informatica PowerCenter 8.6/9.5, B2B, PowerExchange, Oracle SQL, Java, PL/SQL, Teradata 13.10, Hive, HDFS, UNIX, Control-M.

Confidential

Data Integration (ETL) Engineer

Responsibilities:

  • Wrote Teradata MERGE and BTEQ scripts to load and update data in the Mozart and Panther Teradata systems.
  • Wrote Teradata FastLoad and MultiLoad scripts to load data into the Mozart and Panther Teradata systems.
  • Extracted data from various sources using Informatica.
  • Developed Informatica mappings, created tasks, sessions and workflows.
  • Handled development/coding and maintenance/enhancements.
  • Worked as part of the IT support team, handling support functions by implementing code changes, testing them, and deploying them to production.
  • Pushed files from the local file system into HDFS.
  • Loaded data from HDFS into Hive tables (see the sketch after this list).
  • Designed documents based on the requirement specifications.
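
A minimal sketch of the HDFS-to-Hive load described above; the path, database, and table names are hypothetical, and the file is assumed to have already been pushed to HDFS (for example with hadoop fs -put):

    -- Move the staged file from HDFS into the partitioned Hive staging table
    LOAD DATA INPATH '/landing/orders/orders_20160131.txt'
    INTO TABLE staging_db.orders_stg
    PARTITION (load_dt = '2016-01-31');

    -- Basic validation of the loaded slice
    SELECT COUNT(*)
      FROM staging_db.orders_stg
     WHERE load_dt = '2016-01-31';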

Environment: Informatica PowerCenter 8.6/9.5, B2B, PowerExchange, Oracle SQL, Java, PL/SQL, Teradata 13.10, Hive, HDFS, UNIX, Control-M.

Confidential

ETL Engineer

Responsibilities:

  • Ensured regulatory compliance by avoiding errors and by providing a complete audit trail of data flows and transformations.
  • Developed mappings, created tasks, sessions and workflows.
  • Unit testing and system integration testing of the developed code.
  • Provided fixes to issues identified in the testing phase.
  • Provided production support for the deployed project until it stabilized.
  • Extracted data from various sources using Informatica.
  • Implemented slowly changing dimensions to keep track of historical data (see the sketch after this list).
  • Coordinated with the client on task delivery, including timely query resolution, follow-up, and staying within allotted deadlines.
  • Fixed issues with Informatica mappings, mapplets, sessions, workflows, and worklets during the upgrade.
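
A minimal sketch of Type 2 slowly changing dimension logic of the kind referenced above, written as plain SQL; the table and column names are hypothetical, and the actual logic was implemented in Informatica mappings:

    -- Step 1: expire the current dimension row when a tracked attribute changes
    UPDATE customer_dim d
       SET d.current_flag = 'N',
           d.effective_end_dt = CURRENT_DATE
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND s.address <> d.address);

    -- Step 2: insert the new version of each changed row as the current record
    INSERT INTO customer_dim
          (customer_id, customer_name, address,
           effective_start_dt, effective_end_dt, current_flag)
    SELECT s.customer_id, s.customer_name, s.address,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM customer_stg s
      JOIN customer_dim d
        ON d.customer_id = s.customer_id
       AND d.effective_end_dt = CURRENT_DATE   -- rows expired in step 1
     WHERE s.address <> d.address;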

Environment: Informatica PowerCenter 8.6/9.5, PowerExchange, Oracle SQL, DB2, PL/SQL, UNIX, Dollar Universe scheduling tool

Confidential

Data Mart ETL Consultant

Responsibilities:

  • Fixed issues with Informatica mappings, mapplets, sessions, workflows, and worklets during the upgrade.
  • Worked with the analyst teams and assisted them in creating optimized queries.
  • Fulfilled data requests and strategies per business needs and requirements.
  • Created best-practice documents based on general ideas and identified areas of improvement.
  • Wrote test cases and performed unit and integration testing.

Environment: Informatica PowerCenter 8.6/9.5, PowerExchange, Oracle SQL, PL/SQL, UNIX, Dollar Universe scheduling tool

Confidential

ETL Developer

Responsibilities:

  • Developed new business modules on the ETL platform.
  • Ensured regulatory compliance by avoiding errors and by providing a complete audit trail of data flows and transformations.
  • Developed ETL mappings and PL/SQL code per the business requirements.
  • Provided fixes to issues identified in the testing phase.
  • Provided production support for the deployed project until it stabilized.
  • Understood the project architecture through knowledge sharing.

Environment: Informatica PowerCenter 8.6/9.5, PowerExchange, Oracle SQL, PL/SQL, UNIX, Dollar Universe scheduling tool

Confidential

Developer

Responsibilities:

  • Understood the requirement specifications.
  • Developed new business modules on the ETL platform.
  • Wrote complex SQL queries using joins, subqueries, and correlated subqueries.
  • Improved database performance by creating objects such as materialized views, table indexes, and partitions (see the sketch after this list).
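
A minimal sketch of the kind of performance objects mentioned above, in Oracle SQL; the table, view, and column names are hypothetical:

    -- Pre-aggregate a heavy reporting query into a materialized view
    CREATE MATERIALIZED VIEW mv_daily_sales
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    AS
    SELECT sale_dt, product_id, SUM(sale_amt) AS total_amt, COUNT(*) AS txn_cnt
      FROM sales_fact
     GROUP BY sale_dt, product_id;

    -- Index the common filter column on the detail table
    CREATE INDEX idx_sales_fact_dt ON sales_fact (sale_dt);

    -- Refresh the materialized view after the nightly load (SQL*Plus syntax)
    EXEC DBMS_MVIEW.REFRESH('MV_DAILY_SALES', 'C');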

Environment: Informatica PowerCenter 8.6/9.5, Oracle SQL, PL/SQL, UNIX.
