
ETL/Informatica Developer Resume


Tempe, AZ

PROFESSIONAL SUMMARY:

  • Around 6 years of experience in Information Technology with a strong background in database development, data warehousing, and ETL processes using Informatica PowerCenter 9.x/8.x/7.1.3/7.1.1/6.2, including Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor.
  • Experience in software analysis, design, and development of client-server applications, providing Business Intelligence solutions in data warehousing for Decision Support Systems, OLAP, and OLTP application development.
  • Experience in Web Application Development using Python, Django, Java, HTML, and Oracle.
  • Experience working with Power Center Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Experience in dealing with various data sources like Netezza, Oracle, SQL Server 2012/2010/2008, Teradata, flat files, web services, XML, and COBOL/VSAM files.
  • Knowledge in Full Life Cycle development of Data Warehousing.
  • Understand business rules thoroughly from high-level design specifications and implement the corresponding data transformation methodologies.
  • Created UNIX shell scripts to run Informatica workflows and control the ETL flow.
  • Experience in SQL performance tuning.
  • Thorough Knowledge in creating DDL, DML and Transaction queries in SQL for Oracle database.
  • Extensively worked with Informatica performance tuning involving source level, target level and mapping level bottlenecks.
  • Extensive experience with Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Normalizer, Union and XML Source Qualifier.
  • Experience in scheduling of ETL jobs using Control-M.
  • Performed data cleansing of source data using LTRIM and RTRIM operations in the Expression transformation.
  • Experience in all phases of the SDLC, including requirement analysis, implementation, and maintenance, with extensive experience in Agile methodology.
  • Expertise in RDBMS concepts, with hands on exposure in the development of relational database environment using SQL, PL/SQL, Cursors, Stored Procedures, Functions and Triggers.
  • Experience in working with Perl Scripts for handling data coming in Flat files.
  • Experience working with UNIX Shell Scripts for automatically running sessions, aborting sessions and creating parameter files.
  • Performed data validation by Unit testing, integration testing and System Testing.
  • Extensive experience in managing teams/On Shore-Offshore Coordination/Requirement Analysis/Code reviews/Implementing Standards.
  • Good knowledge of Python programming, used in the current project to build views for reporting.
  • Knowledge on Machine Learning and Deep Learning concepts.
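
The workflow automation described above is typically driven through Informatica's pmcmd command-line utility. A minimal Python sketch of assembling such a call (all service, folder, and workflow names here are hypothetical examples, not from an actual project):

```python
# Sketch of driving an Informatica workflow from a script.
# Service/domain/folder/workflow names below are hypothetical; the flags
# (-sv, -d, -u, -f, -wait, -paramfile) are standard pmcmd startworkflow options.

def build_pmcmd_start(service, domain, user, folder, workflow, param_file=None):
    """Assemble a pmcmd startworkflow invocation as an argument list."""
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", service,   # Integration Service name
        "-d", domain,     # Informatica domain
        "-u", user,       # repository user (password would come from env/secure input)
        "-f", folder,     # repository folder
        "-wait",          # block until the workflow completes
    ]
    if param_file:
        cmd += ["-paramfile", param_file]
    cmd.append(workflow)
    return cmd

cmd = build_pmcmd_start("IS_DEV", "Domain_Dev", "etl_user",
                        "SALES_DW", "wf_load_sales",
                        param_file="/opt/infa/params/wf_load_sales.par")
print(" ".join(cmd))
```

In practice a wrapper like this would hand the list to the shell (or `subprocess.run`) and inspect pmcmd's exit code to control the downstream ETL flow.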

TECHNICAL SKILLS:

Languages: Java, Python, SQL, PL/SQL, T-SQL(Transact-SQL), MySQL, UNIX Scripting

Tools and Utilities: Informatica PowerCenter 9.6.1/8.6.1/7.1.3, PowerExchange 9.6/9.5/9.1/8.6, Informatica IDQ, Data Replication, SSIS, RulePoint.

Databases: Oracle 11g/10g/9i/8i, Teradata V2R6/V2R5, SQL Server 2008/2012.

Operating Systems: UNIX, Linux.

Reporting Tools: OBIEE 11g/10g, Tableau.

Others: HP Quality Center v11, Toad, ALM, Erwin 4.x (data modeling), SCM, PuTTY, Informatica MDM, WinSCP.

PROFESSIONAL EXPERIENCE:

Confidential, Tempe, AZ

ETL/Informatica Developer

Responsibilities:

  • Worked in various phases of the Software Development Life Cycle (SDLC), such as requirements gathering and analysis.
  • Interacted daily with data modelers and business analysts to understand the requirements and the impact of the ETL on the business.
  • Created UNIX shell scripts to automate sessions and cleanse the source data.
  • Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
  • Monitored data quality and generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
  • Extensively used Informatica client tools - PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Extracted data from various heterogeneous sources like Oracle, Netezza, and Flat Files.
  • Extracted data from Netezza, Oracle, flat files, and Excel files, and applied Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router, and Update Strategy transformations to load data into the target systems.
  • Used Informatica Data Explorer (IDE) & Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate score cards, create and validate rules and provided data for business analysts for creating the rules.
  • Created Mapplets and used them in different Mappings.
  • Involved in developing deployment groups for deploying code between environments (Dev, QA, and Production).
  • Created pre-SQL and post-SQL scripts to be run at the Informatica session level.
  • Involved heavily in writing complex SQL queries to pull the required information from Database using Teradata SQL Assistant.
  • Optimized SQL queries for better performance.
  • Worked with PL/SQL to create new stored procedures and modify existing procedures per change requests from users.
  • Worked extensively with session parameters, Mapping Parameters, Mapping Variables and Parameter files for Incremental Loading.
  • Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
  • Maintained defect tracking in HP Quality Center and analyzed variances between expected and actual results.
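
The incremental loading mentioned above relies on a PowerCenter parameter file that feeds a mapping parameter (commonly a last-run date) into the source filter. A small Python sketch of generating one; the folder, workflow, and session names are hypothetical, while the `[folder.WF:workflow.ST:session]` header and `$$Param=value` lines follow the PowerCenter parameter-file format:

```python
# Sketch of generating an Informatica parameter file for incremental loading.
# Folder/workflow/session names are made-up examples; the section-header and
# $$MappingParam=value syntax is the standard PowerCenter format.
from datetime import date

def write_param_file(path, folder, workflow, session, last_run_date):
    """Write a one-session parameter file carrying the incremental cutoff date."""
    lines = [
        f"[{folder}.WF:{workflow}.ST:{session}]",
        f"$$LastRunDate={last_run_date.isoformat()}",  # consumed by the mapping's source filter
    ]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return lines

lines = write_param_file("wf_incr_load.par", "SALES_DW",
                         "wf_incr_load", "s_m_incr_load", date(2015, 6, 1))
```

The workflow would then be started with `pmcmd ... -paramfile wf_incr_load.par`, and the script updates `$$LastRunDate` after each successful run.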

Environment: Informatica Power center 9.6.1, Oracle 11/10g, SQL server 1.5.5, SSRS, PL/SQL, IDQ, LSF (Job scheduling), Teradata, UNIX shell scripting, Application Lifecycle Management (ALM) for defect tracking.

Confidential, San Francisco, CA

Informatica Developer

Responsibilities:

  • Data Analysis, Profiling & Requirement Analysis
  • Learned SAS DI to extract the logic from SAS and convert it to Informatica
  • Converted DataFlux rules/jobs into Informatica jobs
  • Developed Informatica mappings to extract data from various sources and load it into Stage. Created various checks on the data prior to load
  • Designed, developed, tested, and implemented jobs that extract, transform, and load data into Netezza.
  • Provided Support to applications and infrastructure of production integration environment, which includes performance tuning, troubleshooting and maintenance of integration platform.
  • Created multiple views and redesigned existing views to provide better performance.
  • Worked on Informatica Data Quality, applying DQ rules, filtering data, and checking data quality.
  • Implemented these rules for particular columns and filtered the data based on these conditions.
  • Viewed the results in dashboards for quality purposes and used OBIEE to generate reports.
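
The column-level rule filtering described above can be sketched in plain Python: each rule validates one column, and rows failing any rule are diverted for review. The rule set and sample records here are illustrative, not taken from an actual IDQ job:

```python
# Sketch of column-level data-quality rules with pass/fail routing.
# The two rules and the sample rows are hypothetical examples.
import re

RULES = {
    "customer_id": lambda v: v is not None and str(v).strip() != "",   # must be non-empty
    "zip_code":    lambda v: bool(re.fullmatch(r"\d{5}", str(v or ""))),  # 5-digit US zip
}

def apply_rules(rows):
    """Route each row to passed or failed based on all column rules."""
    passed, failed = [], []
    for row in rows:
        ok = all(check(row.get(col)) for col, check in RULES.items())
        (passed if ok else failed).append(row)
    return passed, failed

rows = [
    {"customer_id": "C001", "zip_code": "85281"},
    {"customer_id": "",     "zip_code": "94105"},  # fails non-empty rule
    {"customer_id": "C003", "zip_code": "9410"},   # fails zip pattern
]
passed, failed = apply_rules(rows)
```

In the real tool the failed set would feed the score cards and dashboards rather than a Python list, but the rule-per-column shape is the same.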

Environment: Informatica Power center 9.6.1, Oracle 11/10g, SQL server 1.5.5, PL/SQL, IDQ, UNIX shell scripting, Application Lifecycle Management (ALM) for defect tracking.

Confidential

Informatica Developer

Responsibilities:

  • Used Informatica client tools - Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, and Transformation Developer for defining Source & Target definitions and coded the process of data flow from source system to data warehouse.
  • Actively involved in the design and development of the STAR schema data model.
  • Extensively worked with Lookups, Router, Expressions, Source Qualifier, Aggregator, Filter, Sequence Generator, etc.
  • Created and maintained surrogate keys on the master tables to handle SCD type 2 changes effectively.
  • Used SQL tools like SQL Server Management Studio to run SQL queries and validate the data in the warehouse and mart.
  • Developed Informatica mappings, sessions, and workflows for data loads and automated data loads. Created and used mapplets, worklets, and shared objects.
  • Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager.
  • Successfully migrated objects to the QA & production environment while providing both technical and functional support.
  • Performed unit and integration testing and UAT data setup for client validation.
  • Performed data validation and took part in SIT and UAT.
  • Used Oracle hints for better performance on SQL override queries.
  • Created Informatica PowerCenter mappings for Type 1 and Type 2 loads using transformations like Joiner, Lookup, Router, Filter, Union, Sequence Generator, Aggregator, and Expression.
  • Worked on resolving defects using Quality Center.
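
The surrogate-key / SCD Type 2 handling described above follows a standard pattern: when a tracked attribute changes, the current dimension row is expired and a new version is inserted under the next surrogate key. A self-contained Python sketch (column names and sample data are illustrative, not from the actual warehouse):

```python
# Sketch of SCD Type 2 maintenance with surrogate keys.
# Column names (sk, cust_id, city, start_date, end_date) are hypothetical.
from datetime import date

def scd2_apply(dim_rows, natural_key, incoming, as_of, next_sk):
    """Expire the current version for natural_key if attributes changed,
    insert a new version with the next surrogate key, and return next_sk."""
    current = next((r for r in dim_rows
                    if r["cust_id"] == natural_key and r["end_date"] is None), None)
    if current and all(current[k] == v for k, v in incoming.items()):
        return next_sk  # no attribute change: nothing to do
    if current:
        current["end_date"] = as_of  # expire the old version
    dim_rows.append({"sk": next_sk, "cust_id": natural_key, **incoming,
                     "start_date": as_of, "end_date": None})
    return next_sk + 1

dim = [{"sk": 1, "cust_id": "C001", "city": "Tempe",
        "start_date": date(2014, 1, 1), "end_date": None}]
next_sk = scd2_apply(dim, "C001", {"city": "Phoenix"}, date(2015, 3, 1), next_sk=2)
```

In PowerCenter the same logic maps to a Lookup on the current row, a Router splitting insert/expire paths, an Update Strategy, and a Sequence Generator supplying the surrogate key.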

Environment: Informatica Power Center 7.1.3, Oracle 10g, UNIX, Business Objects, PL/SQL, Flat files, SSIS, MS SQL Server 2008.
