
Senior ETL Consultant Resume


SUMMARY:

  • Over 9 years of experience in the IT industry, with skills in Informatica Power Center ETL and in reporting tools.
  • Strong technical knowledge of Data Warehousing implementations and of developing ETL mappings using Informatica Power Center 9.x.
  • Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Designer, Workflow Manager, Workflow Monitor, Metadata Manager and Repository Manager).
  • Expert in designing and developing various types of reports using Cognos 8.x/10.x.
  • Expert in the Framework Manager, Cognos Connection, Report Studio, Query Studio and Workspace/Workspace Advanced tools of Cognos 8/10 BI.
  • Expert in creating visualizations for customized, interactive dashboards in Tableau.
  • Expert in writing efficient SQL and PL/SQL code (stored procedures, functions, triggers and packages) in PL/SQL Developer; a brief illustrative sketch follows this summary.
  • Performance tuning and optimization of SQL.
  • Well versed in Data Warehousing concepts and involved in OLAP data modeling.
  • Developed logical data models, physical data models and Data Model changes using ERwin.
  • Strong understanding of Informatica Power Center architecture.
  • Experience with various data sources such as MDM and Azure (XML via web services), Oracle, SQL Server, and non-relational sources such as flat files, loaded into the staging area.
  • Experienced in UNIX work environment, file transfers, job scheduling and error handling.
  • Worked on performance tuning, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings and sessions.
  • Good exposure to Big Data Hadoop and familiar with Hadoop cluster architecture, the Hadoop ecosystem, HDFS, YARN, Hive, HBase, Pig, MapReduce concepts and job scheduling using Oozie.
  • Utilized Hadoop and its features to prepare the plan to migrate from the existing system and to integrate different applications.
  • Well trained and experienced in Business process, Application Development and Maintenance, SDLC for critical IT projects involving Business analysis, Project design, Domain skills, third-party vendors and cross-functional teams.
  • Strong analytical skills and proficient in creating Functional Specification documents (FSD), project plans, HLD and LLD documents.
  • Expertise in requirements gathering, software requirements specifications, systems development, testing (unit test, integration test and performance test), reporting, application maintenance, application / production support.
  • Excellent documentation and presentation skills, great verbal and written communication skills.
  • Analytical thinker, quick learner and proficient problem-solver who envisions business and technical perspectives to develop workable solutions.
  • Strong leadership qualities and an equally active team member.
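
A minimal, hypothetical sketch of the kind of PL/SQL procedure referenced above; the table and column names (stg_orders, dw_orders) are illustrative only, not from an actual project:

    -- Hypothetical example: merge staged rows into a warehouse table.
    -- Table and column names are illustrative, not from an actual project.
    CREATE OR REPLACE PROCEDURE load_dw_orders IS
    BEGIN
      MERGE INTO dw_orders d
      USING stg_orders s
         ON (d.order_id = s.order_id)
      WHEN MATCHED THEN
        UPDATE SET d.order_amt    = s.order_amt,
                   d.last_updated = SYSDATE
      WHEN NOT MATCHED THEN
        INSERT (order_id, order_amt, last_updated)
        VALUES (s.order_id, s.order_amt, SYSDATE);
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
    END load_dw_orders;
    /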

SKILL:

ETL: Informatica 9.x

Reporting: Cognos BI, Tableau, MS SSRS

Database: ORACLE 11g, SQL Server 2005

IDE Tools: SQL Developer, Toad, PL/SQL Developer, Virtual Machine, PuTTY, WinSCP, Jenkins, Java Eclipse, Visual Studio 2005

Data Modeling: Erwin 7.1

Languages: C, Java, .NET

SHELL Scripts: Unix

Defect Tracking Tool: QC, HP ALM, JIRA, TFS

Scheduling Tools: Control M, LCM

Operating System: Linux, Windows

EXPERIENCE:

Confidential

Senior ETL Consultant

Responsibilities:

  • Designed and developed Informatica Mappings based on business requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Prepared Release plan and Deployment checklist document for all the releases to ensure the post deployment verification is successful.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Implemented metadata framework using ELT methodology to load the data from data warehouse to presentation layer.
  • Wrote complex SQL queries using joins, subqueries and inline views to retrieve data from the database (see the example after this list).
  • Involved in scheduling the workflows through Job scheduler (Control M).
  • Involved in unit testing and system testing to verify that data extracted from different source systems was loaded into the targets accurately and according to the user requirements.
  • Designed an approach for migrating the existing legacy application into the new Informatica Power Center mappings.
  • Estimated the project implementation duration for enhancements and changes based on user requirements and technical feasibility.
  • Managed the implementation team and coordinated the deployment activities to push the project to Production.
  • Worked on Informatica Power Center for extraction, transformation and loading (ETL) of data from heterogeneous source systems and flat files, including fixed-length as well as delimited files.
  • Extensively used various active and passive transformations such as Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator and Aggregator transformations.
  • Performed performance tuning at the source, target, mapping and session levels to manage high volumes of data, with a database size exceeding 5.5 TB.
  • Used the Informatica Debugger to find bugs in existing mappings by analyzing data flow and evaluating transformations, and performed unit testing for individually developed mappings.
  • Involved in onsite and offshore coordination to ensure the completeness of deliverables.
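
A minimal, hypothetical example of the join / subquery / inline-view style referenced above; all table and column names (customer_dim, sales_fact, region_dim) are illustrative only:

    -- Hypothetical example: join a dimension to an aggregated inline view,
    -- with a subquery filter. Names are illustrative, not from an actual project.
    SELECT c.customer_name,
           t.total_amt
    FROM   customer_dim c
           JOIN (SELECT customer_id,
                        SUM(sale_amt) AS total_amt
                 FROM   sales_fact
                 GROUP  BY customer_id) t
             ON t.customer_id = c.customer_id
    WHERE  c.region_code IN (SELECT region_code
                             FROM   region_dim
                             WHERE  active_flag = 'Y');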

Tools: Informatica 9.5.1, Oracle 10g, PL/SQL Developer and TOAD on Windows, Unix shell scripts, Control-M

Confidential

ETL Developer

Responsibilities:

  • Estimated the project implementation duration for enhancements and changes based on user requirements and technical feasibility.
  • Managed the implementation team and coordinated the deployment activities to push the project to Production.
  • Designed and developed Informatica Mappings based on business requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Integrated ETL batch processes that both subscribed to and fed information into the staging areas and the enterprise data warehouse.
  • Worked on Informatica Power Center for extraction, transformation and loading (ETL) of data from heterogeneous source systems and flat files, including fixed-length as well as delimited files.
  • Extensively used various active and passive transformations such as Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator and Aggregator transformations.
  • Managed various data sources such as MDM and Azure (via web services using the XML transformation), Oracle, SQL Server, and non-relational sources such as flat files loaded into the staging area.
  • Performed performance tuning at the source, target, mapping and session levels to manage high volumes of data (billions of rows, with database sizes of 3 to 4 TB per project). Troubleshot long-running sessions and fixed the issues.
  • Made use of Post-Session success and Post-Session failure commands in the Session task to execute scripts needed for cleanup and update purposes.
  • Created UNIX Shell scripts to FTP the files.
  • Responsible for creating and modifying PL/SQL procedures and functions according to the business requirements.
  • Used the Informatica Debugger to find bugs in existing mappings by analyzing data flow and evaluating transformations, and performed unit testing for individually developed mappings.
  • Created visualizations for customized, interactive dashboards in Tableau.
  • Utilized Big Data Hadoop and its features to prepare the plan and to migrate data from RDBMS to the Data Lake using Sqoop and Hive; involved in integrating different applications, addressing both real-time and batch integration scenarios.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Created Pig Latin scripts to sort, group, join and filter the disparate enterprise-wide data.
  • Defined job flows and managed and reviewed the job log files.
  • Loaded and transformed large sets of structured and semi-structured data.
  • Involved in creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs (see the sketch after this list).
  • Implemented data model changes using the ERwin tool.
  • Worked closely with Business Analysts and Data Scientists to understand the functional and business needs, and created the required functional and technical documents.
  • Involved in onsite and offshore coordination to ensure the completeness of deliverables.
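
A minimal, hypothetical HiveQL sketch of the create / load / query pattern referenced above; the table, columns and HDFS path are illustrative only:

    -- Hypothetical HiveQL sketch; table, column and path names are illustrative only.
    CREATE TABLE IF NOT EXISTS stg_customer (
      customer_id   INT,
      customer_name STRING,
      region_code   STRING
    )
    ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    STORED AS TEXTFILE;

    LOAD DATA INPATH '/data/landing/customer/' INTO TABLE stg_customer;

    -- This aggregation is executed by Hive as MapReduce jobs on the cluster.
    SELECT region_code, COUNT(*) AS customer_cnt
    FROM   stg_customer
    GROUP  BY region_code;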

Tools: Informatica 9.5.1, Hadoop, Oracle 11g, PL/SQL Developer, LCM, Tableau, Erwin 7.1

Confidential

Cognos BI Developer

Responsibilities:

  • Effort estimation for reports.
  • Gathered detailed requirements from business users.
  • Interacted with the client regularly to understand and gather requirements and change requests.
  • Prepared technical design documents for the requirements.
  • Designed and developed business data models (relational and DMR) in Framework Manager based on business requirements.
  • Designed and developed reports using Report Studio.
  • Involved in scheduling reports using Event Studio.
  • Performed unit testing and performance tuning of reports.

Tools: Cognos 8.4 BI Suite (Report Studio, Analysis Studio, Event Studio, Query Studio, Framework Manager), Oracle 10g

Confidential

Cognos BI Developer

Responsibilities:

  • Gathered detailed requirements from business users.
  • Interacted with the client regularly to understand and gather requirements and change requests.
  • Designed and developed business data models (relational and DMR) in Framework Manager based on business requirements.
  • Designed and developed reports using Report Studio.
  • Implemented unions and derived queries to improve report performance (see the sketch after this list).
  • Assigned permissions/capabilities in the IBM Cognos Administration interface for Groups and Roles created in LDAP.
  • Designed Sales Reports and Scorecards for Dealers from various regions.
  • Involved in scheduling reports using Event Studio.
  • Performed unit testing and performance tuning of reports.
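
A minimal, hypothetical SQL sketch of the union-of-derived-queries pattern mentioned above (in Cognos this would normally be modeled as query objects in Report Studio rather than hand-written SQL); table and column names are illustrative only:

    -- Hypothetical example: combine two pre-aggregated derived queries with UNION ALL
    -- so that each branch aggregates before the merge. Names are illustrative only.
    SELECT region, SUM(sale_amt) AS total_amt, 'CURRENT' AS period
    FROM   current_sales
    GROUP  BY region
    UNION ALL
    SELECT region, SUM(sale_amt) AS total_amt, 'PRIOR' AS period
    FROM   prior_sales
    GROUP  BY region;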

Tools: Cognos 8.4 BI Suite (Report Studio, Analysis Studio, Event Studio, Query Studio, Framework Manager), Oracle 10g

Confidential

SSRS Developer

Responsibilities:

  • Developed views and reports using SQL Server Reporting Services and SQL Server 2000 (see the sketch after this list).
  • Supported ASP.NET bug fixes.
  • Unit testing.
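
A minimal, hypothetical T-SQL sketch of the kind of view an SSRS report dataset might select from; all object names are illustrative only:

    -- Hypothetical view feeding an SSRS report dataset; names are illustrative only.
    CREATE VIEW dbo.vw_monthly_sales
    AS
    SELECT region,
           DATEPART(year, sale_date)  AS sale_year,
           DATEPART(month, sale_date) AS sale_month,
           SUM(sale_amt)              AS total_sale_amt
    FROM   dbo.sales
    GROUP  BY region, DATEPART(year, sale_date), DATEPART(month, sale_date);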

Tools: SSRS, ASP.NET, SQL Server 2005
