
Senior ETL Informatica Developer Resume


Torrance, CA

SUMMARY

  • Over 8 years of IT experience across multiple operating systems and platforms, covering the analysis, design, integration and implementation of business applications.
  • Informatica PowerCenter 9.6/9.1/8.6/8.5/8.1/7.1 and Oracle 11g/10g/9i (SQL/PL-SQL). Extensive experience in System Analysis, Design, Development, Implementation, Production Support and Maintenance of Data Warehouse business applications in the Pharmaceutical, Finance and Insurance industries.
  • Experience in various phases of the Software Development Life Cycle, including requirements gathering, Analysis, Design, Development and Testing.
  • Experience with Software Development Life Cycle (SDLC) and agile methodology.
  • Experience in the Analysis, Design, Development, Testing and Implementation phases of Business Intelligence solutions using ETL tools such as Informatica PowerCenter 10.2/9.6.1/8.6 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor) and Informatica on Cloud.
  • Working knowledge of data warehouse techniques and practices, including ETL processes, dimensional and relational data models (Star Schema, Snowflake Schema, FACT and Dimension tables), OLTP, and OLAP.
  • Experienced working in Agile Teams and familiar with Scrum Roles, Scrum Ceremonies and Scrum Artifacts.
  • Experience in IT industry working with tools Informatica Data Quality, Informatica Data Analyst and Informatica Power Center.
  • Worked as interim Scrum Master, facilitating the ceremonies and helping the team remove impediments and achieve Sprint goals.
  • Worked closely with data scientists to migrate prediction algorithms/models from R-Studio to the Python scikit-learn API, and was involved in feature selection for the prediction models (see the scikit-learn sketch after this list).
  • Created UNIX shell scripts to run the Informatica workflows and control the ETL flow (a Python sketch of the same launch pattern follows this list).
  • Extensive experience in Relational Data Modeling, Dimensional Data Modeling, Logical/Physical Design, ER Diagrams, Forward and Reverse Engineering, publishing Erwin diagrams, analyzing data sources and creating interface documents.
  • Experienced in providing Tier 2 and Tier 3 production support for Autosys batch processes, performing troubleshooting using PowerCenter logs, Linux/UNIX scripts and DB processes.
  • Analyzed, designed, developed, implemented and maintained parallel jobs using IBM InfoSphere DataStage.
  • Excellent comfort level in attending Daily Scrum Stand up, Sprint planning, Sprint Review and Sprint Retrospective Meetings.
  • Experienced in Installation, Configuration, and Administration of Informatica Data Quality and Informatica Data Analyst.
  • Familiar with the AWS cloud services like EC2, Elastic Container Service (ECS), Simple Storage Service (S3) and Elastic MapReduce (EMR).
  • Expertise in working with relational databases such as Oracle 12c/11g/10g/9i, SQL Server, and Teradata.
  • Experience in Design and Development of ETL methodology for supporting Data Migration, data transformations & processing in a corporate wide ETL Solution using Teradata TD 14.0/13.0/12.0.
  • Database experience using Oracle, SQL Server, Azure SQL Server, SAP HANA, Teradata, DB2 and MS Access.
  • Experience working with the Informatica tool, using it effectively for Data Integration and Data Migration from multiple source systems into Azure SQL Data Warehouse.
  • Experience in Informatica PowerCenter Designer components - Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
  • Proficient in the development of Extract, Transform and Load processes, with a good understanding of source-to-target data mapping and the ability to define and capture metadata and business rules.
  • Experience in Data Analysis, Data Mapping, Data Modelling, Data Profiling, and development of Databases for business applications and Data warehouse environments.
  • Configured Informatica BDM/BDQ/EIC/IDL 10.1/10.2.1 and integrated it with the Kerberos-enabled Cloudera and Hortonworks Hadoop frameworks.
  • Experience in creating Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Knowledge and Experience on data processing phases, from the Enterprise Model, Data Model (Logical and Physical Model), and Data Warehousing (ETL).
  • Used the default data domain groups (IDQ/BDQ Product Content), created customized rules to make sure all scenarios are covered, and followed up with data stewards to update the curation status, based on which the access levels for the groups are decided.
  • Skillful experience in functional programming using Scala and Python.
  • Responsible for producing Credit Risk legal entity reports for EMEA that are submitted to senior management.
  • Worked with different data formats such as JSON and XML, and applied machine learning algorithms in Python (see the JSON/XML parsing sketch after this list).
  • Experience in using Informatica PowerCenter for Extraction, Transformation and Loading of data using heterogeneous source and target systems like Flat files (fixed width and delimited), Salesforce, Oracle.
  • Experienced in writing Stored Procedures, Package, Functions, Triggers, Views and Materialized Views using SQL and PL/SQL.
  • Knowledge of accounting and financial reporting, including regulatory reporting.
  • Extensive ETL tool experience using IBM InfoSphere/WebSphere DataStage and Ascential DataStage.
  • Proficiency in Data Warehousing techniques for Data Cleansing, Slowly Changing Dimension (SCD) handling, Surrogate Key assignment and Change Data Capture (CDC).
  • Experienced in scheduling sequence and parallel jobs using DataStage Director, UNIX scripts and scheduling tools.
  • Experience in supporting and performing Unit testing, System Integration testing, UAT and production support for issues raised by application users.
  • Experience with Integration of data from heterogeneous sources such as Relational tables, flat files, MS Excel and XML files.
  • Experience in developing Test Plans, Test Strategies and Test Cases for Data Warehousing projects ensuring the data meets the business requirements.
  • 2 years of experience with the Business Intelligence reporting tool OBIEE (Oracle Business Intelligence Enterprise Edition).
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business stakeholders, and developers across multiple teams.
  • Attended 4 days of Agile training sessions on Agile Methodology and various Agile Frameworks.
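
A minimal sketch of the R-to-scikit-learn migration mentioned above; the dataset and model choice here are placeholders, not the actual prediction models:

    # Minimal sketch of porting a prediction model to scikit-learn.
    # The features and the classifier are hypothetical stand-ins.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Stand-in for the features chosen during the feature-selection phase.
    X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))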
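
A sketch of the workflow-launch pattern referenced above; the production scripts were UNIX shell, and the service, domain, folder and credential values below are placeholders:

    # Launch an Informatica workflow via the pmcmd CLI (normally done
    # from a UNIX shell script). All connection values are placeholders.
    import subprocess
    import sys

    def run_workflow(folder: str, workflow: str) -> int:
        """Start a PowerCenter workflow and wait for it to complete."""
        cmd = [
            "pmcmd", "startworkflow",
            "-sv", "INTEGRATION_SVC",   # integration service (placeholder)
            "-d", "DOMAIN_DEV",         # domain name (placeholder)
            "-u", "etl_user", "-p", "etl_pass",
            "-f", folder,
            "-wait",                    # block until the workflow finishes
            workflow,
        ]
        return subprocess.call(cmd)

    if __name__ == "__main__":
        rc = run_workflow("FOLDER_SALES", "wf_load_sales")
        sys.exit(rc)  # non-zero exit signals failure to the scheduler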
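
A small sketch of the JSON/XML handling mentioned above, using only the Python standard library; the record layout is hypothetical:

    # Reading the same hypothetical record from JSON and from XML
    # before handing it to downstream processing.
    import json
    import xml.etree.ElementTree as ET

    json_doc = '{"order_id": 101, "amount": 250.75}'
    xml_doc = '<order><order_id>101</order_id><amount>250.75</amount></order>'

    rec_json = json.loads(json_doc)

    root = ET.fromstring(xml_doc)
    rec_xml = {
        "order_id": int(root.findtext("order_id")),
        "amount": float(root.findtext("amount")),
    }

    assert rec_json == rec_xml  # both formats yield the same record
    print(rec_json)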

TECHNICAL SKILLS

ETL SKILLS: Informatica PowerCenter 10.4.1/9.1/8.6/8.5/8.1/7.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager and Informatica Server)

Business Intelligence: SQL Server Reporting Services (SSRS) 2014/2012/2008/2005, SQL Server Analysis Services (SSAS) 2014/2012/2008/2005.

Databases: Oracle 11g/10g/9i/8i/8.0, DB2, MySQL, Teradata, MS Access.

Data Modeling tools: Erwin, MS Visio, Star Schema Modeling, Snow Flake Modeling

Scheduling tools: Control M, Crontab, Tidal, DAC

Cloud Technologies: AWS, Azure Cloud

Defect Management: HP Quality Centre, HP ALM, JIRA

Languages: SQL, PL/SQL, T-SQL, C/C++/C#, Python, Perl, UNIX Shell Scripting, XML, HTML, VB.NET

OLAP Tools: Cognos 8.0/8.1/8.2/8.4/7.0, Business Objects XI R2/6.x/5.x, OBIEE 10.1.3.4/10.1.3.3

Testing Tools: QTP, Win Runner, Load Runner, Quality Center, Test Director

Methodologies: Agile, Waterfall

Operating Systems: UNIX, Windows 2008/2007/2005/NT/XP, MS-DOS, Solaris

PROFESSIONAL EXPERIENCE

Confidential, Torrance, CA

Senior ETL Informatica Developer

Responsibilities:

  • As a consultant, studied the existing data marts to understand and integrate the new source of data.
  • Managed the offshore support group in India for support issues as well as small enhancements to the data warehouse.
  • Prepared the weekly status report and coordinated weekly status calls with the technology lead/business.
  • Designed and created new Informatica jobs to implement new business logic into the existing process
  • Used Informatica modules (Repository Manager, Designer, Workflow Manager and Workflow Monitor) to accomplish the end-to-end ETL process.
  • Performed data profiling on sources during mapping development to analyze the content, quality and structure of the source data.
  • Developed data models according to company standards. Used DB2 Connect to extract multiple database source systems into a single analytical system for large databases; provided 24x7 production support for data warehousing.
  • Created required scripts/transformations to extract the source data from various sources such as Oracle, Flat Files etc.
  • Used the complex functionality of Informatica (Mapplets, Stored Procedures, Normalizer, Update Strategy, Router, Joiner, Java, SQL transformations, etc.) to translate the business logic into mappings.
  • Worked within a development team to develop application programs that work with web and UNIX technologies using iWay Service Manager, Java, servlets, XML, HTML, JSPs, JavaScript, Web Services, Master Data Management technology, and Mainframe/Host technology integration.
  • Experience working with Azure SQL Data Warehouse integration, various native (v2 and v3) connectors, and the Microsoft connector with PDO support for Azure SQL.
  • Good understanding of Azure SQL DW concepts relating to storage, distribution, DWUs, resource user groups, connection strings, etc.
  • Developed a web service on the Postgres database using the Python Flask framework, which served as the backend for a real-time dashboard (see the Flask sketch after this list).
  • Involved in data analysis and handling ad-hoc requests by interacting with business analysts, clients and customers, and resolving issues as part of production support.
  • Provided production support for various manual processes and deployments.
  • Designed and developed complex aggregate, joiner, lookup transformations to implement the business rules in the ETL mappings to load the target Facts and Dimensions.
  • Defined Target Load Order Plan for loading data into Target Tables
  • Used Mapplets and Reusable Transformations to prevent redundancy of transformation usage and improve maintainability.
  • Created complex Informatica mappings, as well as simple mappings with complex SQL, based on the needs of the business user.
  • Used Informatica's features to implement Type 1 and Type 2 slowly changing dimensions and Change Data Capture (CDC) (see the SCD Type 2 sketch after this list).
  • Fine-tuned the session performance using Session partitioning for long running sessions.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Technologies used: WhereScape RED & 3D, MS Azure, SSRS, SSIS, Power BI, TFS, Python, Azure Data factory, Tableau, Informatica, Snowflake, Shell scripting, Oracle, SolarWinds, REST and MS Intune API
  • Extensively used DataStage for extracting, transforming and loading databases from sources including Oracle, DB2 and flat files.
  • Wrote UNIX and Perl scripts for business needs.
  • Coded UNIX scripts to capture data from different relational systems into flat files for use as source files in the ETL process.
  • Extensive experience delivering Data warehousing implementations, Data migration and ETL processes to integrate data across multiple sources using Informatica PowerCenter and Informatica Cloud Services
  • Helped IT reduce the cost of maintaining the on-premises Informatica PowerCenter servers by migrating the code to Informatica Cloud Services.
  • Developed optimized PL/SQL code for server-side packages to centralize application logic; procedures containing PL/SQL were created, stored in the database, and fired when the contents of the database changed.
  • Used debugger to test the mapping and fixed the bugs.
  • Extensively used DataStage Change Data Capture for DB2 and Oracle files and employed change capture stage in parallel jobs.
  • Developed Spark scripts in Scala, per the requirements, using the Spark 1.5 framework.
  • Developed Scala scripts and UDFs using both DataFrames/SQL and RDDs in Spark for data aggregation and queries, writing data back onto HDFS (a PySpark sketch of this pattern follows this list).
  • Conducted design and code reviews and extensive documentation of standards, best practices and ETL procedures.
  • Experienced in writing Stored Procedures, Package, Functions, Triggers, Views and Materialized Views using SQL and PL/SQL.
  • Gained end-to-end implementation experience during the build of the EIM organization model and the integrated conceptual data model.
  • Developed FastLoad jobs to load data from various data sources and legacy systems into Teradata staging.
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements against the Teradata RDBMS.
  • Exposed to various other technologies like Business Objects, MDM.
  • Supported production issues, detecting and resolving job failures from unexpected causes.
  • Created shell scripts to run DataStage jobs from UNIX, then scheduled these scripts through scheduling tools such as Autosys and Tivoli.
  • Extensive knowledge of linking data from multiple sources, using functionality such as combined queries, drill-down, and master-detail in Business Objects.
  • Involved in analyzing legacy sources to figure out data quality issues.
  • Created triggers for a Talend job to run automatically on server.
  • Worked on Exporting and Importing of Talend jobs using Talend Admin Console.
  • Performing data manipulations using various Informatica Transformations like Joiner, Expression, Lookup, Aggregate, Filter, and Update Strategy.
  • Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Sessions and scheduling them to run at specified time.
  • Developed Oracle Stored Procedures, Packages and Functions and utilized in ETL Process.
  • Handled performance tuning of Informatica mappings at various levels to accomplish the established standard throughput.
  • Analyzed the target data mart for accuracy of data against the predefined reporting needs.
  • Worked on migrating and upgrading the Informatica PowerCenter 9.5.1 application on Windows to 10.2 on Red Hat Linux, which involved bringing up a parallel environment, restoring the domain from the old server onto the new one, and migrating the repositories.
  • Containerized and deployed the ETL and REST services on AWS ECS through the CI/CD Jenkins pipeline.
  • Conducted unit testing of all ETL mappings and helped the QA team conduct their testing.
  • Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
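
A minimal sketch of the Flask-on-Postgres service mentioned above; the connection settings and the kpi_snapshot table are hypothetical placeholders:

    # Minimal Flask endpoint backed by Postgres for a dashboard feed.
    from flask import Flask, jsonify
    import psycopg2

    app = Flask(__name__)

    def get_conn():
        # Placeholder connection settings.
        return psycopg2.connect(
            host="localhost", dbname="dashboard", user="app", password="secret"
        )

    @app.route("/api/kpis")
    def kpis():
        with get_conn() as conn, conn.cursor() as cur:
            cur.execute("SELECT metric, value FROM kpi_snapshot ORDER BY metric")
            rows = cur.fetchall()
        return jsonify({metric: value for metric, value in rows})

    if __name__ == "__main__":
        app.run(port=5000)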
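
The SCD work above was built with Informatica transformations; purely to illustrate the Type 2 logic itself, here is a Python sketch (the column names and tracked attribute are hypothetical):

    # Illustrative Type 2 slowly-changing-dimension merge: close the current
    # version of a changed row and append a new one.
    from datetime import date

    def scd2_merge(dimension, incoming, today=None):
        """dimension/incoming: lists of dicts keyed by 'cust_id'."""
        today = today or date.today()
        current = {r["cust_id"]: r for r in dimension if r["end_date"] is None}
        for row in incoming:
            old = current.get(row["cust_id"])
            if old is None:                   # brand-new key: insert version 1
                dimension.append({**row, "start_date": today, "end_date": None})
            elif old["city"] != row["city"]:  # tracked attribute changed
                old["end_date"] = today       # expire the old version
                dimension.append({**row, "start_date": today, "end_date": None})
        return dimension

    dim = [{"cust_id": 1, "city": "Torrance",
            "start_date": date(2020, 1, 1), "end_date": None}]
    scd2_merge(dim, [{"cust_id": 1, "city": "Irvine"}])
    print(dim)  # old row closed, new current row appended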
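
The production Spark scripts were Scala; the same DataFrame-plus-UDF aggregation pattern is sketched below in PySpark (the data, column names and HDFS path are made-up placeholders):

    # PySpark sketch of the DataFrame + UDF aggregation pattern.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.appName("agg_sketch").getOrCreate()

    df = spark.createDataFrame(
        [("A1", "east", 120.0), ("A2", "east", 80.0), ("A3", "west", 45.0)],
        ["order_id", "region", "amount"],
    )

    # Simple UDF, standing in for the Scala originals.
    bucket = F.udf(lambda amt: "large" if amt >= 100 else "small", StringType())

    result = (
        df.withColumn("size_bucket", bucket("amount"))
          .groupBy("region", "size_bucket")
          .agg(F.sum("amount").alias("total_amount"))
    )

    # Placeholder HDFS path.
    result.write.mode("overwrite").parquet("hdfs:///tmp/agg_sketch")
    spark.stop()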

Environment: Informatica PowerCenter 10.x/9.x, Informatica PowerExchange 10.1, Informatica BDM 10.1.0, IDQ, IICS R29, Tableau 10.0, Oracle 10g/11g/12c, Linux, SQL, PL/SQL, SQL*Loader, CVS, TOAD, UNIX, SSIS, Snowflake (IICS Data Warehouse), PostgreSQL, Teradata 15, AWS S3, SQL Server 2012, Control-M, Shell Scripting, JSON, Windows.

Confidential, Nashville, Tennessee

Sr. ETL Informatica Developer

Responsibilities:

  • Used Informatica PowerCenter for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Developed the audit activity for all the IICS mappings.
  • Automated/Scheduled the IICS jobs to run daily with email notifications for any failures.
  • Created application connections for databases, Salesforce Service Cloud, Salesforce Sales Cloud, Salesforce Financial Services Cloud, and many portals.
  • Performed data integration with SFDC and Microsoft Dynamics CRM using Informatica Cloud (IICS).
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Analyzed the complex Java/Python scripts in order to write the mapping rules.
  • Created UNIX shell scripts for automation of ETL processes.
  • Analyzed the complex PL/SQL queries and Python scripts, and developed Ab Initio plans/graphs to replace the existing jobs with Ab Initio code.
  • Conducted rigorous regression testing to verify that the new code did not break existing functionality of the Java/Python code.
  • Responsibilities included end-to-end activities such as mapping specifications, development, testing, production support and deployment.
  • Performed historical loads while loading source table data into the Data Warehouse, maintaining data versions.
  • Design, Development, Testing and Implementation of ETL processes using Informatica Cloud (IICS).
  • Performed SQL and PL/SQL tuning using tools like EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE (see the EXPLAIN PLAN sketch after this list).
  • Extensively used Oracle Hints to direct the optimizer to choose an optimum query Execution Plan
  • Extensively used Bulk Collection in PL/SQL Objects for improving the performance
  • Handled errors using Exception Handling extensively for debugging and maintainability.
  • Automated Oracle and Informatica workflows using Unix Cron Utility in Unix Environment.
  • Responsible for writing UNIX shell scripts for loading data using SQL*Loader; the control files for the tables were created and the loads into Oracle tables automated through UNIX shell scripts. Used SQL*Loader and PL/SQL scripts to load data into the system application (a Python sketch of the control-file pattern follows this list).
  • Performed Data Cleansing to weed out erroneous Orders from the DWH tables
  • Extensively used Toad to develop Oracle PL/SQL Packages.
  • Created procedures and functions in SQL Server to migrate data from legacy servers to Oracle data warehouse tables.
  • Created Informatica workflows (7.0 for Flat Roll / 9.0 for Tubular implementations) for implementing business logic.
  • Worked on Sessions & workflows for variety of loads starting from Source to Target.
  • Created parameter files to override session properties for the source directory, bad file directory, source and target, and created mapplet parameters for mapplets (see the parameter-file sketch after this list).
  • Created mappings using Designer, extracted the data from flat files and other RDBMS databases into the staging area, and loaded it onto the Data Warehouse.
  • Developed the Informatica Mappings by using Aggregator, SQL Overrides by using Lookups, source filter by using Source Qualifiers and Data Flow Management into multiple targets using Router Transformation.
  • Developed reusable Workflows, Worklets, Mappings and Mapplets using transformations including Router, Aggregator, Lookup, Filter, Expression, Sequence Generator, Update Strategy, Joiner and Union
  • Created shell scripts for automating the business logic.
  • Created ABPP (Agile Business Process Platform) workflows that sequence the execution of Oracle packages, Informatica workflows and shell scripts per the business logic.
  • Created file-watcher jobs to set up the dependency between Cloud and PowerCenter jobs.
  • All the QAT test cases were sent to business UAT team for the power user approvals.
  • Validated the data loaded using the internal navigator tools for maintaining integrity of data.
  • Designed, developed and tested applications using Informatica PowerCenter, Informatica Data Quality, and Informatica Master Data Management to meet functional specifications.
  • Created data mappings and workflows using Informatica PowerCenter to extract, transform and load data into the target reporting environment.
  • Worked on Informatica Scheduler to schedule daily, weekly, monthly and quarterly jobs.
  • Followed Agile Scrum model for managing the daily responsibilities and tasks.
  • Used UNIX for check-ins and check-outs of workflows and config files into ClearCase.
  • Improving the performance of the ETL by indexing and caching.
  • Worked with mapping parameters, mapping variables, mapplets and parameter files in the Mapping Designer of Informatica PowerCenter.
  • Developed, maintained and administered complex ETL processes using IBM DataStage.
  • Built new ETL processes to make new and existing data sources readily available to the business using IBM DataStage.
  • Documented designs, architected data maps, developed data quality components, and established and/or conducted unit tests.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Migrated mappings, sessions and workflows from Development to Test and then to the UAT environment.
  • Developed Informatica Mappings and Reusable Transformations to facilitate timely Loading of Data of a star schema.
  • Involved in writing SQL Stored procedures and Shell Scripts to access data from various sources.
  • Assisted other ETL developers in solving complex scenarios and coordinated with source systems owners with day-to-day ETL progress monitoring.
  • Imported various heterogeneous files using Informatica PowerCenter 9.6 Source Analyzer.
  • Coordinated with Autosys team to run Informatica jobs for loading historical data in production.
  • Documented Data Mappings/ Transformations as per the business requirement.
  • Created XML and Autosys JIL files for the developed workflows.
  • Migrated code from Development to Test and, upon validation, to the Pre-Production and Production environments.
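
The tuning above relied on Oracle's EXPLAIN PLAN; as a sketch, the same plan capture can be driven from Python with the python-oracledb driver (the connection details and the orders query are hypothetical):

    # Capture an Oracle execution plan via the standard
    # EXPLAIN PLAN / DBMS_XPLAN pair.
    import oracledb

    conn = oracledb.connect(user="etl_user", password="etl_pass",
                            dsn="dbhost/ORCLPDB1")  # placeholders
    cur = conn.cursor()

    cur.execute(
        "EXPLAIN PLAN FOR SELECT * FROM orders WHERE order_date > SYSDATE - 7"
    )

    # DBMS_XPLAN.DISPLAY renders the contents of PLAN_TABLE.
    cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
    for (line,) in cur:
        print(line)

    conn.close()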
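
A sketch of the SQL*Loader pattern above, generating a control file and invoking sqlldr; the production drivers were UNIX shell scripts, and the table, file names and credentials here are placeholders:

    # Generate a SQL*Loader control file and invoke sqlldr.
    import subprocess
    from pathlib import Path

    CTL = """\
    LOAD DATA
    INFILE 'orders.csv'
    APPEND INTO TABLE stg_orders
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (order_id, customer_id, amount, order_date DATE "YYYY-MM-DD")
    """

    Path("orders.ctl").write_text(CTL)

    rc = subprocess.call([
        "sqlldr",
        "etl_user/etl_pass@ORCLPDB1",   # placeholder credentials
        "control=orders.ctl",
        "log=orders.log",
        "bad=orders.bad",
    ])
    raise SystemExit(rc)  # surface the sqlldr exit code to the scheduler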
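
A sketch of generating a parameter file like those above; the [Folder.WF:Workflow.ST:Session] section layout is standard PowerCenter parameter-file syntax, while the folder, workflow, session and values are hypothetical:

    # Emit a PowerCenter parameter file with session-property overrides.
    def write_param_file(path, folder, workflow, session, params):
        lines = [f"[{folder}.WF:{workflow}.ST:{session}]"]
        lines += [f"{name}={value}" for name, value in params.items()]
        with open(path, "w") as fh:
            fh.write("\n".join(lines) + "\n")

    write_param_file(
        "wf_load_orders.param",
        folder="FOLDER_SALES",
        workflow="wf_load_orders",
        session="s_m_load_orders",
        params={
            "$InputFile1": "/data/in/orders.csv",     # source file override
            "$BadFileName1": "/data/bad/orders.bad",  # bad file override
            "$$LoadDate": "2024-01-31",               # mapping parameter
        },
    )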

Environment: Informatica PowerCenter 10.1/9.6, Informatica PowerExchange 10.4.1/9.6, Oracle 10g/11g/12c, SQL, PL/SQL, Data Quality, IICS, SQL Server 2008, Shell Scripts, Teradata 14, UNIX, Toad, SQL Developer, HP Quality Center, Cognos 9, T-SQL, MS Access, Windows.

Confidential, Irving, TX

Sr. ETL Informatica Developer

Responsibilities:

  • Analyzed the functional specifications and made the necessary modifications through onsite team interaction.
  • Understood and analyzed the requirements and provided solutions for related issues, covering activities such as requirement analysis, ETL design, workflow scheduling, and maintenance, enhancement and support for the client.
  • Prepared the ETL specifications for the business logic.
  • Involved in dimensional modeling (Star Schema) of the data warehouse; designed dimensions and facts based on user requirements.
  • Designed the Data Warehouse/ETL processes using Informatica PowerCenter 9.6 to extract, transform and load data from multiple input sources such as SQL Server, flat files and Oracle into the target Teradata database.
  • Created temporary database (preparation DB) to load preparation data.
  • Created complex mappings using various transformations such as Rank, Joiner, Expression, Lookup (Connected/Unconnected), Aggregate, Filter
  • Created BTEQ files with sets of complex SQL queries, and stored procedures and triggers (PL/SQL).
  • Coordinated with the application development team to achieve success in design, development and deployment activities.
  • Worked on different tasks in workflows: sessions, event-raise, event-wait, decision, e-mail, command, worklets, timer and scheduling of the workflow.
  • Performed tuning of SQL queries for speedy extraction of data, to resolve and troubleshoot issues in the OLTP environment.
  • Involved in tuning of mappings and sessions for better performance
  • Modified existing mappings for enhancements of new business requirements.
  • Created mapplets and mappings in IDQ Developer to ensure data is up to standards.
  • Performed various data quality checks to ensure accuracy, correctness, data consistency, data completeness in prod environment
  • Prepared migration document to move the mappings from development to testing
  • Prepared mapping document to explain flow of data from source to target.
  • Generated Web Intelligence reports using BO XI R3 on an ad-hoc basis.
  • Extensively involved with backend testing by writing complex SQL queries
  • Developed unit/assembly test cases and UNIX shell scripts to run along with daily /weekly /monthly batches to reduce or eliminate manual testing effort.
  • Used Teradata FastExport to dump data from the DB to flat files (a driver-based Python sketch of this step follows this list).
  • Focused on data quality issues/problems that include completeness, conformity, consistency, accuracy, duplicates and integrity.
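
FastExport itself is a Teradata command-line utility; purely to illustrate the dump-to-flat-file step, here is a Python analogue using the teradatasql driver and the csv module (the host, credentials and table are placeholders):

    # Dump a Teradata table to a pipe-delimited flat file. The real job
    # used the FastExport utility; this is a driver-based analogue.
    import csv
    import teradatasql

    with teradatasql.connect(host="tdhost", user="etl_user",
                             password="etl_pass") as conn:
        cur = conn.cursor()
        cur.execute("SELECT order_id, customer_id, amount FROM stg.orders")
        with open("orders.txt", "w", newline="") as fh:
            writer = csv.writer(fh, delimiter="|")
            writer.writerow([d[0] for d in cur.description])  # header row
            writer.writerows(cur.fetchall())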

Environment: Informatica 9.6.1, Teradata, Oracle 11g, BO, HP ALM, QuerySurge, UNIX, XML, CSV files, SQL Developer, MS Excel, Windows 10

Confidential

Informatica PL/SQL Developer

Responsibilities:

  • Worked with the business team to gather requirements for projects and created strategies to handle the requirements.
  • Designed custom reports via SQL Server Reporting Services to align with requests from internal account teams and external clients.
  • Designed and implemented ETL mappings and processes as per the company standards, using Informatica PowerCenter.
  • Applied Slowly Changing Dimensions Type I and Type II based on business requirements.
  • Worked on project documentation which included the Functional, Technical and ETL Specification documents.
  • Developed stored procedures and functions using PL/SQL, and driving scripts using UNIX shell scripts, working with both the SQL and PL/SQL engines.
  • Performed SQL and PL/SQL tuning and Application tuning using various tools like Explain Plan, SQL*Trace and TKPROF.
  • Effectively made use of Table Functions, generated columns, Indexes, Table Partitioning, Collections, and Materialized Views.
  • Used Ref Cursors, Synonyms, Indexes, Joins and Exceptions extensively in coding; tuned long-running SQL queries using Explain Plan and Hints to reduce response time.
  • Worked on various tables to create Indexes to improve query performance and partitioning large Tables.
  • Worked in implementing table partitions using Range, Hash, Composite techniques.
  • Good understanding of Star Schema, Snowflake Schema, Dimension and Fact table.
  • Extensive use of Unix Shell Scripts and Autosys to automate process.
  • Used Autosys to schedule the batch jobs for Intraday and End of the business day cycles.
  • Worked on UNIX and Perl scripts for scheduling Autosys jobs in UAT/SIT/PFIX/PROD Environments.
  • Worked on Version Control tool such as Tortoise SVN and used the JIRA bug tracking tool.
  • Responsible for troubleshooting back end UAT and production issues.
  • Extensively worked on complex mappings which involved slowly changing dimensions.
  • Optimized the source queries in order to control the temp space and added delay intervals depending upon the business requirement for performance.
  • Involved in designing the Data Mart models with Erwin using Star schema methodology.
  • Extensively worked on performance tuning, as well as on isolating the header and footer in a single file.
  • Used Repository Manager to create repositories and user groups, and managed users by setting up privileges and profiles.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations in the Mapping Designer.
  • Worked on developing Change Data Capture (CDC) mechanism using Informatica PowerExchange for some of the interfaces based on the requirements and limitations of the Project.
  • Created and maintained the Shell Scripts and Parameter files in UNIX for the proper execution of Informatica workflows in different environments.
  • Managed performance and tuning of SQL queries and fixed the slow running queries in production.
  • Provided 24x7 production support when necessary.
  • Created batch scripts for automated database build deployment (a Python sketch of the idea follows this list).
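
A sketch of the automated build-deployment idea in the last bullet: apply an ordered set of SQL build scripts to the target database. The originals were batch scripts; the driver, paths and credentials here are placeholders:

    # Apply versioned SQL build scripts in order, using python-oracledb.
    from pathlib import Path
    import oracledb

    conn = oracledb.connect(user="deploy", password="secret",
                            dsn="dbhost/ORCLPDB1")  # placeholders
    cur = conn.cursor()

    # Scripts named like 001_create_tables.sql, 002_add_indexes.sql, ...
    for script in sorted(Path("db/build").glob("*.sql")):
        print(f"applying {script.name}")
        # Naive statement splitter: fine for plain DDL, not PL/SQL blocks.
        for stmt in script.read_text().split(";"):
            if stmt.strip():
                cur.execute(stmt)

    conn.commit()
    conn.close()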

Environment: Informatica PowerCenter 9.6, Informatica PowerExchange 9.6, Oracle Forms 9i/10g, CVS, TOAD, UNIX, SOAP, REST, Shell scripting, Oracle Warehouse Builder 11.2.0.2, Subversion (SVN), Perl, Erwin 5, UNIX CRONTAB, Control-M, Remedy Incident Tool, UltraEdit, Teradata 13.

Confidential

ETL Developer

Responsibilities:

  • Interacted with business analysts, data architects, application developers to develop a data model.
  • Using Informatica PowerCenter Designer, analyzed the source data to extract and transform from various source systems (Oracle 10g, DB2, SQL Server and flat files), incorporating business rules using different objects and functions that the tool supports.
  • Analyze business requirements, technical specification, source repositories and physical data models for ETL mapping and process flow.
  • Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.
  • Documented Informatica mappings in an Excel spreadsheet.
  • Tuned the Informatica mappings for optimal load performance.
  • Used the BTEQ, FastExport (FEXP), FastLoad (FLOAD) and MultiLoad (MLOAD) Teradata utilities to export and load data to/from flat files.
  • Created and Configured Workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Documentation of Technical specification, business requirements, functional specifications for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables.
  • Developed PL/SQL programs that included writing Stored Procedures, Packages, Functions and Database Triggers.
  • Created procedures to drop and recreate the indexes in the target data warehouse before and after the sessions (a Python sketch of this bracket follows this list).
  • Effectively read session error logs and used the Debugger to test mappings and fix bugs in DEV, following change procedures and validation.
  • Worked on PowerCenter client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer - in Informatica PowerCenter (10.1/9.6).
  • Developed transformation logic by writing a query in Source Qualifier to cleanse the source data (from DB2) of inconsistencies before loading the data into staging area.
  • Parsed high-level design specification to simple ETL coding and mapping standards by using expression transformation.
  • Involved in enhancements and maintenance activities of the data warehouse, including performance tuning and code enhancements.
  • Experience in writing UNIX shell scripts for pre-processing and post-processing steps.
  • Raised change requests, analyzed and coordinated resolution of program flaws and fixed them in DEV and Pre-Production environments, during the subsequent runs and PROD.
  • Worked with large amounts of data, independently executing data analysis utilizing appropriate tools and techniques, interpreting results and presenting them to both internal and external clients.
  • Generated reports using OBIEE 10.1.3 for future business use.
  • Scheduled jobs to run daily, weekly and monthly loads through Control-M, with each workflow in a sequence with command and event tasks.
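
The drop-and-recreate step above was implemented as Oracle stored procedures; the bracket around a bulk load is sketched below in Python (the index names and DDL are placeholders):

    # Drop indexes before a bulk load and recreate them afterwards
    # (done in production as Oracle stored procedures).
    import oracledb

    INDEXES = {
        "idx_sales_cust": "CREATE INDEX idx_sales_cust ON fact_sales (customer_id)",
        "idx_sales_date": "CREATE INDEX idx_sales_date ON fact_sales (sale_date)",
    }

    def bulk_load(cur):
        # Stand-in for the session that loads fact_sales.
        cur.execute("INSERT INTO fact_sales SELECT * FROM stg_sales")

    conn = oracledb.connect(user="etl_user", password="etl_pass",
                            dsn="dbhost/ORCLPDB1")  # placeholders
    cur = conn.cursor()
    for name in INDEXES:
        cur.execute(f"DROP INDEX {name}")  # faster inserts without index upkeep
    bulk_load(cur)
    for ddl in INDEXES.values():
        cur.execute(ddl)                   # rebuild after the load
    conn.commit()
    conn.close()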

Environment: Informatica 9.5.0, Oracle 10g, SQL Server 2005, SQL, T-SQL, PL/SQL, OBIEE 10.1.3, Toad, Erwin 4.x, UNIX, Tortoise SVN, Flat files.
