
Etl/informatica Developer Resume


Tampa, FL

SUMMARY:

  • 7+ years of experience in various phases of projects in developing, enhancing and maintaining applications across industry verticals like Retail, Insurance, Banking, Pharmaceutical and Health Care using Informatica Power Center.
  • 7+ years of experience in using Informatica Power Center (10.1/9.6/9.1/8.6/8.1).
  • One year of experience in Informatica Intelligent Cloud Services (IICS).
  • Migration of Informatica Jobs from Dev, QA to Prod environments using Repository Manager.
  • Involved in INFA Admin repository upgrade Activities and INFA Admin Support Activities.
  • Proficiency in developing SQL against various relational databases like Netezza, Oracle, Teradata and SQL Server, using tools such as Toad.
  • Extensively involved in Query Level Performance tuning using Optimizer Hints, partitioning tables and Indexes.
  • Knowledge in Full Life Cycle development of Data Warehousing.
  • Experience in all the phases of the Data warehouse life cycle involving requirements gathering/analysis, design, development, validation & testing of Data warehouses using ETL, Data Modelling & Reporting tools.
  • Good working knowledge in querying Salesforce.com database using SOQL & SOSL queries using Force.com Explorer & Workbench.
  • Used Informatica Power Connect for SAP to pull data from SAP R/3.
  • Exposure and experience in different methodologies like Agile and Waterfall.
  • Accustomed to working in Sprints and delivering efficient code.
  • Performed Informatica upgrade from V8.6.1 to 9.0.1, 9.1.0 and 9.6.1.
  • Creation and maintenance of Informatica users and privileges
  • Experience in Dimensional Data Modeling: Star Schema, Snowflake Schema, Fact and Dimension Tables.
  • Expert on Slowly Changing Dimensions Type 1, Type 2 and Type 3 for inserting and updating Target tables for maintaining the history.
  • Understand business rules completely based on high-level document specifications and implement the data transformation methodologies.
  • Adept at writing UNIX shell scripts to run Informatica workflows and control the ETL flow.
  • Developed UNIX scripts for dynamic generation of Files & for FTP/SFTP transmission.
  • Vast experience in designing and developing complex mappings using varied transformation logic like Unconnected and Connected Lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, etc.
  • Proficient in Test execution and Bug tracking.
  • Hands-on experience conducting scrum meetings and sprint planning using tools like JIRA, Confluence & HP-ALM to track the progress of projects.
  • Good experience using scheduler tools like Tidal, Control-M and Autosys.
  • Ability to meet deadlines and handle multiple tasks, and to work under both Agile and Waterfall methodologies.
  • Experienced in identifying Business requirements, Technical requirements, Database Design, Troubleshooting and writing interfaces between various applications.
  • Excellent leadership skills to work in team, analytical skills, logical reasoning, interpersonal skills and attitude to lead. Excellent communications skills, fast & enthusiastic learner, excellent work ethics and a good team player.
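Several of the bullets above mention UNIX shell scripts that run Informatica workflows and control the ETL flow. A minimal sketch of such a wrapper is shown below; the service, domain, folder and workflow names are placeholders, and the `pmcmd` invocation assumes a standard Power Center client install:

```shell
#!/bin/sh
# Minimal wrapper for starting an Informatica workflow via pmcmd and
# failing fast on a non-zero exit code. All names are hypothetical.
PMCMD=${PMCMD:-pmcmd}   # allow overriding the binary (e.g. for testing)

run_workflow() {
    folder=$1
    workflow=$2
    "$PMCMD" startworkflow \
        -sv INT_SVC -d DOM_DEV \
        -u "$INFA_USER" -p "$INFA_PWD" \
        -f "$folder" -wait "$workflow"
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "ERROR: workflow $workflow failed with exit code $rc" >&2
        return "$rc"
    fi
    echo "OK: workflow $workflow completed"
}
```

The `-wait` flag keeps `pmcmd` in the foreground so the scheduler sees the workflow's real exit status rather than a fire-and-forget success.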

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 10.1.0/9.6.1/9.5/9.0/8.6/8.1, Aginity AMP, Informatica Intelligent Cloud Services (IICS)

Big Data Tools: Hive, HUE 2.6.1-2

Database Tools: Toad, Oracle 11g/10g/9i, Microsoft SQL Server 2000/2005/2008, IBM Netezza (Aginity Workbench), DB2, Teradata, SQL*Loader, SQL*Plus.

Scheduling Tools: Control-M, Autosys and Tidal.

Internet Technologies and Microsoft tools: HTML, XML, MS Office tools (Outlook, Word, Excel, Visio, PowerPoint, Publisher)

Version Control Tools: TFS, VSTS and GitHub.

Programming Languages: PL/SQL, UNIX Shell Scripting.

File Transfer Tools: FTP, Putty, and FileZilla

OS and Other Tools: Windows All versions, Skype, JIRA, Confluence.

PROFESSIONAL EXPERIENCE:

Confidential, Tampa, FL

ETL/Informatica Developer

Responsibilities:

  • Design and develop quality integration workflows between multiple applications utilizing Informatica tools.
  • Analyzed the business requirements and functional specifications.
  • Participated in system analysis and data modeling, which included creating tables, views, indexes, synonyms, triggers, functions, procedures, cursors and packages.
  • Created mappings, mapping configuration tasks and taskflows with Informatica Intelligent Cloud Services (IICS) and Informatica Power Center (10.1.1 & 9.6.1).
  • Provide production support and solve complex integrations issues.
  • Worked in Production Support Environment as well as QA /TEST environments using Quality Center tool for projects, work orders, maintenance requests, bug fixes, enhancements, data changes.
  • Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
  • Used TFS, VSTS and GIT for version control.
  • Work closely with the teams to ensure architectural integrity related to integrations activities.
  • Collaborate with business stakeholders for UAT and Quality Assurance for unit/technical testing.
  • Provide leadership and technical guidance within the integrations and specifically Informatica landscape.
  • Work alongside team members to architect, design and develop quality deliverables.
  • Built mappings using Informatica Data Quality and exported them as Mapplets to Informatica Power Center to read JSON-format files.
  • Created various workflows to read .JSON files and load them into staging tables and then into the respective Dimension and Fact tables.
  • Created QA and PROD change requests using ServiceNow to move the code to QA and PROD respectively.
  • Drive results and leverage past experiences to improve the environment and related processes.
  • Foster a spirit of teamwork, coach and mentor junior staff.
  • Communicate regularly with other managers, the director, line of business leads and other designated contacts within the organization.
  • Report on the status of development, quality, operations, and system performance to management.
  • Troubleshooting, diagnostics & performance tuning in database cost-based optimizer mode.
  • Enhanced the performance of the procedures by creating suitable global and local indexes.

Environment: Informatica Power Center 10.1.1/9.6.1, Informatica Intelligent Cloud Services (IICS), Toad, Oracle SQL Developer, SQL Server Management Studio (SSMS), JIRA, ServiceNow, Visual Studio Team Services (VSTS), GitHub.

Confidential, St. Petersburg, FL

ETL/Informatica Developer

Responsibilities:

  • Used Informatica 9.6.1 to load data from legacy tables to Netezza tables.
  • All tasks were assigned as tickets and tracked in JIRA.
  • The primary source was a vault that needed to be pulled from Hive and loaded into the landing area.
  • Delivered all assigned tasks on time; work was divided into two-week sprints.
  • Attended all business requirement meetings along with my lead and played a crucial role in building the requirements.
  • Worked intensively with the team to build easily maintainable code that met the client's standards.
  • Prepared 301 business documents that cover the complete data flow from source to target.
  • Built Data Integration components using Informatica Power Center, following the local DI Framework and recipes (i.e., the ETL Cookbook), against Netezza, Oracle and SQL Server on UNIX/Linux and Windows operating systems.
  • Extracted data from cloud sources (Hive HUE); the HDFS system dumped files into a Kafka queue known as Hive Prod HUE.
  • Used the HUE 2.6.1-2 web interface to query the data.
  • Used Aginity AMP to generate natural keys from combinations of multiple foreign keys.
  • Wrote complex SQL overrides to join heterogeneous databases.
  • Worked on writing and tuning complex SQL queries and shell scripts.
  • Participated in ETL production support every three weeks, working on the issues assigned.
  • Performed unit testing for the code built.
  • Closely worked with Quality Analyst to make sure the code meets the required standards and the data is valid.
  • Used Autosys to schedule the jobs by creating JIL scripts.
  • Actively participated in code migration with DBA and admins.
  • Kept all documents current and available in SharePoint.
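One of the bullets above mentions scheduling jobs in Autosys by creating JIL scripts. A hypothetical JIL definition for a nightly Informatica load might look like the following; the job name, machine, owner, script path and times are all placeholders, not taken from the project:

```
/* Hypothetical Autosys JIL for a nightly Informatica load */
insert_job: WF_DAILY_SALES_LOAD
job_type: c
machine: etl_server01
owner: infauser
command: /opt/etl/scripts/run_wf.sh WF_FOLDER wf_daily_sales
date_conditions: 1
days_of_week: all
start_times: "02:00"
std_out_file: /var/log/etl/wf_daily_sales.out
std_err_file: /var/log/etl/wf_daily_sales.err
alarm_if_fail: 1
```

`job_type: c` declares a command job, and `alarm_if_fail: 1` raises an Autosys alarm when the wrapped workflow exits non-zero.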

Environment: Informatica 9.6.1, HUE-2.6.1-2, Aginity AMP, Hive, Oracle, Netezza, Autosys, JIRA

Confidential, Phoenix, AZ

ETL/Informatica Developer

Responsibilities:

  • Used Informatica 9.5 to load data from Oracle to Netezza tables.
  • Used JIRA and Confluence as project tracking tools, where all tasks were built and updated.
  • Used Confluence to upload all documents and decision tasks.
  • Played a main role in creating the templates for different status pages and in adding and tracking our tasks in JIRA.
  • Although this project followed the Waterfall methodology, the team still held daily scrum meetings, and I took the lead in conducting the daily standup.
  • As the project had a very tight deadline and high priority (the contract with eBay was ending in Dec 2016), worked hard as a team to keep the work from falling at risk.
  • Prepared all the design documents consisting of the data flow from source to target, mapping all columns required for reporting purposes.
  • Handled the complex mappings by modifying some of the core tables that contain Confidential customer data, as well as the sales tables involved in the batch load.
  • Created detailed and high-level data model diagrams to explain the data flow using the data modeling tool ERwin.
  • Worked extensively on understanding the business requirements and designed the logic to populate the data as expected.
  • Created DDL and DML scripts containing the structure of new tables and modifications to existing tables.
  • Built mappings, worklets and workflows to load the data into the staging area, and then into DW tables.
  • Used Push Down Optimization to increase the Performance.
  • Created Tidal jobs for automation of work Flows.
  • Took responsibility for creating the implementation plan document and worked closely with the admins during go-live.
  • Provided one-month warranty support, which is a regular process at Confidential.

Environment: Informatica 9.1/9.5, SQL Server, Oracle, Netezza, Tidal and JIRA.

Confidential, Warren, NJ

ETL/Informatica Developer

Responsibilities:

  • ETL / Data Warehouse development experience with SQL Server/T-SQL.
  • Excellent knowledge of Informatica Power Center development.
  • Created various shell scripts using UNIX.
  • Worked on Kimball Methodology, Star Schemas and slowly changing dimensions.
  • Worked in a multi-tasking team environment supporting multiple in-flight development projects at the same time.
  • Proficient in SDLC methodologies.
  • Extracted high volumes of data sets from Salesforce.com (SFDC) using Informatica ETL mappings and SQL/PLSQL scripts and loaded them into the Data Warehouse.
  • Involved in querying Salesforce tables with SOQL queries using Force.com Explorer.
  • Good knowledge of SOQL and Salesforce governor limits.
  • Worked on SQL Server including DDL, TSQL, Stored Procedures, etc. in large-scale relational databases.
  • Excellent project management abilities, worked within a large project plan and able to manage own tasks and time.
  • Worked on SQL server, Oracle, Sybase databases.
  • Used UNIX scripts for scheduling and executing Informatica workflows.
  • Worked on scheduler tools like Autosys and Control-M.
  • Designed Work Flows that uses multiple sessions and command line objects (which are used to run the UNIX scripts).
  • Created UNIX scripts for FTP and SFTP files to different servers.
  • Used Informatica file watch events to poll the FTP sites for the external mainframe files.
  • Designed the ETL specification documents to gather existing workflows information from different ETL teams and shared with Integration and production maintenance team.
  • Designed the ETL runs performance tracking sheet in different phases of the project and shared with Production team.
  • Interacted with developers and product owners regularly.
  • Involved in Functional, Positive and Negative testing of the product.
  • Documented and reported defects within established process and tracking systems using ALM.
  • Prepared the validation report queries, executed after every ETL runs, and shared the resultant values with Business users in different phases of the project.
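The bullets above mention UNIX scripts for FTP/SFTP file transfers to different servers. A minimal sketch of such a script is below, assuming key-based authentication; the host, user and paths are hypothetical:

```shell
#!/bin/sh
# Sketch of an SFTP push: build a non-interactive batch file, then ship
# a data file to a remote landing directory. All names are placeholders.
build_sftp_batch() {
    localfile=$1
    remotedir=$2
    batchfile=$3
    {
        echo "cd $remotedir"
        echo "put $localfile"
        echo "bye"
    } > "$batchfile"
}

send_file() {
    build_sftp_batch "$1" "$2" /tmp/sftp_batch.txt
    # -b runs the batch file non-interactively; key-based auth assumed
    sftp -b /tmp/sftp_batch.txt etluser@remote.host.example
}
```

Keeping the batch-file construction in its own function makes the transfer commands easy to inspect in the job log before the `sftp` call runs.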

Environment: Informatica 9.6.1, SQL Server 2008, Oracle, Teradata, Salesforce, Autosys, Shell Scripting, FTP/SFTP.

Confidential, Bentonville, AR

ETL/Informatica Developer

Responsibilities:

  • Worked with Business Analyst and Analyzed specifications and identified source data needs to be moved to data warehouse, participated in the Design Team and user requirement gathering meetings.
  • Worked on Informatica - Repository Manager, Designer, Workflow Manager & Workflow Monitor.
  • Created ETL mappings and mapplets to extract data from ERPs like SAP and ENTERPRISE 1 (Oracle) and load it into the EDW (Oracle 10g).
  • Fixed workflow failures in unit testing and system testing.
  • Performed Informatica upgrade from V9.1 to 9.5.
  • Migration of Informatica Mappings/Sessions/Workflows from Dev, QA to Prod environments.
  • Documented the LDAP configuration process and worked closely with Informatica Technical support on some of the issues.
  • Identified multiple Dimension and Fact tables. Used advanced data modeling concepts such as degenerated dimensions, sub-dimensions, factless fact tables and aggregate fact tables in the multidimensional model.
  • Performed Teradata SQL Queries, creating Tables, and Views by following Teradata Best Practices.
  • Extensively tested the Business Objects report by running the SQL queries on the database by reviewing the report requirement documentation.
  • Scheduling all the ETL workflows for the parallel run comparison.
  • Scheduled batch jobs using Control-M to run the workflows.
  • Involved in preparing the migration list inventory.
  • Involved in requirement gathering for redesign candidates.
  • Used FTP services to retrieve Flat Files from the external sources and RDBMS sources Teradata and Oracle.
  • Wrote several UNIX Korn shell scripts for file transfers, error log creation and the log file cleanup process.
  • Worked along with the Informatica professionals to resolve Informatica upgrade issues.
  • Worked with BA in the QA phase of testing.
  • Extensively involved in ETL testing, Created Unit test plan and Integration test plan to test the mappings, created test data. Use of debugging tools to resolve problems.
  • Used workflow monitor to monitor the jobs, reviewed error logs that were generated for each session, and rectified any cause of failure.
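The Korn shell log-cleanup scripts mentioned above can be sketched as follows; the log directory and retention window are placeholder values, not the project's actual settings:

```shell
#!/bin/ksh
# Sketch of a session-log cleanup: delete *.log files older than a
# retention window. Directory and retention values are hypothetical.
LOG_DIR=${LOG_DIR:-/opt/infa/sess_logs}
RETAIN_DAYS=${RETAIN_DAYS:-7}

cleanup_logs() {
    # remove *.log files not modified within the last $RETAIN_DAYS days
    find "$LOG_DIR" -name '*.log' -type f -mtime +"$RETAIN_DAYS" -exec rm -f {} \;
}
```

Running this from the scheduler right after the nightly load keeps the session-log directory from filling the filesystem between maintenance windows.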

Environment: Informatica Power Center 9.1/8.6, Oracle 11g, PL/SQL, SAP R/3, Control-M, SQL, Teradata, SQL*Loader, TOAD, Shell Scripting, UNIX, FTP/SFTP.

Confidential, Woodlands, CA

ETL / Informatica Developer

Responsibilities:

  • Involved in gathering requirements from end users and involved in modifying data according to the requirement.
  • Worked with the end users to identify, create, and deliver reports according to requirements.
  • Interacted with business users and source system (OLTP) IT teams to define, agree and document incoming data mapping specifications.
  • Extensively involved in ETL code using Informatica tool in order to meet requirements for extract, cleansing, transformation and loading of data from source to target data structures.
  • Created ETL mappings, sessions and workflows using various transformations like Update Strategy, Lookup, Filter, Router, XML, etc.
  • Worked on various issues on existing Informatica Mappings to produce correct output and used ETL debugger extensively to identify the performance bottlenecks within the mappings.
  • Actively involved in creating Oracle Stored Procedures, Packages, Functions, Triggers, Records, Arrays and Exception handling using TOAD, SQL*PLUS and PL/SQL.
  • Created FastLoad, MultiLoad, TPump and BTEQ scripts to load data into Teradata.
  • Created Data models for data warehouse development.
  • Involved in identifying data discrepancies and data quality issues and worked to ensure data consistency and integrity.
  • Worked with UNIX scripts for automation of ETL Jobs using Control-M Scheduler and Involved in migration/conversion of ETL processes from development to production environment.
  • Working experience using Push Down Optimization.
  • Did performance tune of Informatica components for daily and monthly incremental loading.
  • Tested all the applications, transported the data to the target warehouse Oracle tables, scheduled and ran the extraction and load processes, and monitored sessions and batches using Informatica Workflow Manager.
  • Involved in Onsite & Offshore coordination to ensure the completeness of Deliverables.

Environment: Informatica Power Center 9.1, IDQ, Oracle 11gR2, DB2, SQL, PL/SQL, Teradata, UNIX shell scripting, Control-M and TOAD 10.5.

Confidential, East Hanover, NJ

Informatica Developer

Responsibilities:

  • Constantly interacted with business users to discuss requirements.
  • Involved in the analysis of the user requirements and identifying the sources.
  • Created technical specification documents based on the requirements by using S2T Documents.
  • Involved in the preparation of high-level design documents and low-level design documents.
  • Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart.
  • Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage, ODS and Mart.
  • Followed the Ralph Kimball approach (a bottom-up data warehouse methodology in which individual data marts, such as the Shipment Data Mart, Job Order Cost Mart, Net Contribution Mart, and Detention & Demurrage Mart, provide views into the organizational data and are later combined into a Management Information System (MIS)).
  • Prepared a Level 2 Update plan to assign work to team members; this plan is very helpful for knowing the status of each task.
  • Administered the repository by creating folders and logins for the group members and assigning necessary privileges.
  • Designed and developed Informatica mappings and sessions based on business user requirements and business rules to load data from source flat files and Oracle tables to target tables.
  • Worked on various kinds of transformations like Expression, Aggregator, Stored Procedure, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
  • Developed reusable Mapplets and Transformations.
  • Used debugger to debug mappings to gain troubleshooting information about data and error conditions.
  • Involved in monitoring the workflows and in optimizing the load times.
  • Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
  • Involved in writing procedures, functions in PL/SQL.
  • Developed mappings in Informatica using BAPI and ABAP function calls in SAP.
  • Rational ClearCase was used to control versions of all files & folders (check-out, check-in).
  • Prepared test Scenarios and Test cases in HP Quality Center and involved in unit testing of mappings, system testing and user acceptance testing.
  • Rational ClearQuest was used for defect tracking and reports.

Environment: Informatica Power Center 9.1, Business Intelligence, Informatica Power Exchange, SQL, Oracle11gR2, DB2, Erwin, Control M and TOAD 10.5.

Confidential

Informatica developer

Responsibilities:

  • Designed and Developed mappings using different transformations like Source Qualifier, Expression, Lookup (Connected & Unconnected), Aggregator, Router, Rank, Filter and Sequence Generator.
  • Created Update Strategy and Stored Procedure transformations to populate targets based on business requirements.
  • Extensively used PL/SQL programming procedures, packages to implement business rules.
  • Created and Configured Workflows, Worklets and Sessions to transport the data to target warehouse tables using Workflow Manager.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations. Developed Procedures and Functions in PL/SQL for ETL.
  • Extensively used ETL to load data from source systems like Flat Files and Excel Files into staging tables and load the data into the target database Oracle.
  • Used Update Strategy and target load plans to load data into Type 1/Type 2 Dimensions.
  • Created and used reusable Mapplets and transformations.
  • Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.
  • Designed and developed pre-session, post-session routines and batch execution routines.
  • Debugging and troubleshooting of mappings.

Environment: TOAD, Informatica Power Center 8.1, Oracle 9.1.
