
Sr. ETL Informatica Developer Resume

Orlando, FL

SUMMARY:

  • 8+ years of overall experience in Informatica PowerCenter, data warehousing and data modeling, responsible for data extraction, transformation and loading from multiple sources into the data warehouse.
  • Experience in understanding SDLC (Software Development Life Cycle), analyzing business requirements, design and development, Data Modeling, Implementation and testing of Data warehousing.
  • Experience with Erwin; strong knowledge of relational and dimensional database modeling concepts and star/snowflake schema implementation; good knowledge of OLTP and OLAP systems; expertise in integrating various sources - Oracle, Teradata and flat files.
  • Designing Star Schemas & Snowflake Schemas for OLAP Systems and extensive understanding of Transactional and Dimensional Data Modeling concepts.
  • Work experience has given me the opportunity to acquire an in-depth knowledge of the Retail, Healthcare, Banking and Tax domains, implementation techniques, system development as well as building customer relationship skills.
  • Expertise in ETL methodology in a corporate wide ETL Solution using Informatica PowerCenter 10.1/9.x/8.x, Oracle 11g/10g applications, Teradata, SQL Server, SQL and PL/SQL.
  • Extensive experience in developing mappings, Mapplets, transformations, worklets, workflows, sessions and tasks, scheduling workflows and sessions, and implementing slowly changing dimensions.
  • Proficient in Implementation of Data warehouse from Requirement gathering, Data Analysis, data modeling, Application Design, development and data cleansing.
  • Good experience in Teradata using queries and utilities like BTEQ, MLOAD, FLOAD, TPUMP and Fast Export.
  • Experience in creating efficient high-level and low-level ETL documents.
  • Hands on exposure on UNIX Environment and experience in using third party scheduling tools like Autosys.
  • Converted several Informatica workflows to Hadoop, Spark and Scala.
  • Strong experience with Informatica tools using real-time CDC (change data capture) and MD5.
  • Worked on Hadoop, Hive and Sqoop for landing and mart loads.
  • Worked on Effort Estimation, coordination with various teams, worked with Clients, Users, Onsite team and offshore team and time management in line with the deadlines.
  • Good experience in debugging and performance tuning of sources, targets, mappings, transformations and sessions, as well as SQL performance tuning.
  • Experience in Task Automation using UNIX Shell Scripts, Job scheduling and communicating with Server using pmcmd, pmrep to schedule and control sessions, batches and repository tasks.
  • Excellent analytical, problem solving and communication skills with ability to interact with individuals at all levels.
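The task-automation bullet above mentions controlling sessions with pmcmd from UNIX shell scripts. A minimal sketch of such a wrapper is below; the service, domain, folder and workflow names (INT_SVC, INFA_DOM, DWH_FOLDER, wf_daily_load) are illustrative placeholders, and the script falls back to a dry run when pmcmd is not on the PATH.

```shell
#!/bin/sh
# Sketch: start an Informatica workflow via pmcmd and wait for completion.
# All object names are placeholders; credentials come from environment
# variables via pmcmd's -uv/-pv options rather than being hard-coded.
start_workflow() {
    wf="$1"
    cmd="pmcmd startworkflow -sv INT_SVC -d INFA_DOM -uv INFA_USER -pv INFA_PASS -f DWH_FOLDER -wait $wf"
    if command -v pmcmd >/dev/null 2>&1; then
        $cmd                     # real invocation on a PowerCenter host
    else
        echo "DRY-RUN: $cmd"     # sketch degrades gracefully elsewhere
    fi
}

start_workflow wf_daily_load
```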

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 10.1/9.x/8.x, Informatica Data Quality 10.1, Informatica PowerExchange 10.1/9.x

Data Modeling: Physical Modelling, Logical Modelling, Relational Modelling, Dimensional Modelling (Star Schema, Snow-Flake, Fact, Dimensions), Entities, Attributes

Databases: Teradata 15/14, Oracle 12c/11g, MS SQL Server 2000, MS Access, DB2, Netezza

Languages: SQL, PL/SQL, Data Structures, Unix Shell Scripting

Tools: SQL*Plus, PL/SQL Developer, SQL*Loader, Teradata SQL Assistant, TOAD, PuTTY

Operating Systems: Windows NT/2000/2003/XP/Vista/7, Unix, Linux

Scheduling Tools: BMC Control-M, Autosys

PROFESSIONAL EXPERIENCE:

Confidential, Orlando, FL

Sr. ETL Informatica Developer

Responsibilities:

  • Developed mappings using the needed transformations in Informatica according to technical specifications.
  • Implemented Informatica error handling, dynamic parameter file generations and coding best practices.
  • Responsible for Technical Design Document and Details mapping document.
  • Analyzing and understanding functional requirements.
  • Used Informatica reusability at various levels of development.
  • Involved in the performance tuning of Informatica mappings, stored procedures and the SQL queries inside the Source Qualifier.
  • Designed and developed Informatica mappings for data sharing between interfaces utilizing SCD Type 1, Type 2 and CDC methodologies.
  • Worked on Informatica PowerCenter and used different transformations - Filter, Router, Joiner, Update Strategy (Static and Dynamic), Aggregator.
  • Created parameter files and passed connection string and parameter values using parameter files.
  • Worked on Unix shell scripting to transfer the files to different server locations.
  • Validated and tested the mappings using Informatica Debugger, Session Logs and Workflow Logs.
  • Created detail Unit test plans and performed error checking and testing of the ETL procedures using SQL queries, filtering out the missing rows into flat files at the mapping level.
  • Developed User Defined Functions and Created Materialized Views for high availability of data.
  • Developed mapping to load Fact and Dimension tables, SCD type 1, SCD type 2 dimensions and incremental loading and unit tested the mappings.
  • Scheduled and ran Workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Worked on Teradata queries, procedures and macros. Experience in Performance tuning.
  • Worked on Teradata utilities like FLOAD to load in landing, MLOAD to load in Dimension.
  • Developed Informatica mappings and workflows for new change requests (CRs).
  • Developing Informatica ETL jobs to extract data from EBS and to load into the SQL Server.
  • Extensively used Informatica PowerCenter and created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
  • Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
  • Created scripts for Batch test and set the required options for overnight, automated execution of test scripts
  • Conceptualized and developed initial and incremental data loads in Informatica using Update strategy transformation, variables.
  • Resolving issues raised by customer, coordinated with Users to find and resolve issues.
  • Handling Issues during development and analyzing phase and Fixing Bugs if any.
  • Preparing Test Results, Unit testing, supported SIT and UAT. Worked on Production support.
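The bullets above mention dynamic parameter file generation and passing connection strings and parameter values through parameter files. A minimal sketch of such a generator is below; the folder, workflow and session names (DWH_FOLDER, wf_stg_load, s_m_stg_load) and the parameter names are illustrative placeholders, not taken from the resume.

```shell
#!/bin/sh
# Sketch: generate an Informatica parameter file at run time so each
# session picks up a fresh run date and source directory.
PARM_FILE="${TMPDIR:-/tmp}/wf_stg_load.parm"
RUN_DATE=$(date +%Y%m%d)

# Section header format is [Folder.WF:workflow.ST:session]; the escaped
# \$\$ prefix emits literal $$ mapping-parameter names into the file.
cat > "$PARM_FILE" <<EOF
[DWH_FOLDER.WF:wf_stg_load.ST:s_m_stg_load]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_DIR=/data/inbound
\$\$LOAD_TYPE=INCREMENTAL
EOF

cat "$PARM_FILE"
```

The workflow or session is then pointed at this file via its parameter-file attribute, so each nightly run reads current values without touching the mapping.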

Environment: Informatica PowerCenter 10.1, Teradata 14, Oracle 12c, MySQL, XML, Flat Files, SQL Assistant, Autosys, TOAD, PL/SQL, Erwin, Unix shell scripting, DB2, Unix, Windows.

Confidential, CA

Sr. ETL Developer

Responsibilities:

  • Translated business requirements into SQL query to implement performance optimization on advertiser side.
  • Created Data Mapping Documents as per business rules applied in source data to target data.
  • Responsible for creating the Design document and design flow of the data in the project.
  • Responsible for metadata quality, ensuring that domain values are normalized, redundancy is eliminated and the metadata delivers business value.
  • Experience in Extraction, Transformation and Loading (ETL) data from various data sources into Data Marts and Data Warehouse using Informatica PowerCenter components (Repository Manager, Designer, Workflow Manager, Workflow Monitor and Informatica Administration Console).
  • Strong experience in developing Sessions/Tasks, Worklets and Workflows using the Workflow Manager tools (Task Developer, Workflow and Worklet Designer).
  • Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from Oracle and SQL Server databases.
  • Data expert and SME in multiple areas that includes Consolidated Data repository, Customer Data Warehouse, IVR data repository and Metering Database.
  • Reviewed normalized/denormalized schemas for effective, optimally tuned queries and data validations in OLTP environments.
  • Responsible for developing the mapping, session and workflow as per the dataflow and load the data in Oracle database.
  • Solely responsible for migration of the ETL code to different environments - Dev, Test and Prod.
  • Created the DDL scripts and implemented in different environments.
  • Enhanced the ETL mapping to improve the performance.
  • Worked on Informatica PowerCenter and used different transformations - Filter, Router, Joiner, Update Strategy (Static and Dynamic), Aggregator.
  • Created parameter files and passed connection string and parameter values using parameter files.
  • Worked on Unix shell scripting to transfer the files to different server locations.
  • Validated and tested the mappings using Informatica Debugger, Session Logs and Workflow Logs.
  • Created detail Unit test plans and performed error checking and testing of the ETL procedures using SQL queries, filtering out the missing rows into flat files at the mapping level.
  • Developed User Defined Functions and Created Materialized Views for high availability of data.
  • Developed mapping to load Fact and Dimension tables, for type 1 and type 2 dimensions and incremental loading and unit tested the mappings.
  • Scheduled and ran Workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Worked on PowerExchange CDC to capture data in real time. Created registrations and data maps for DB2 tables and COBOL files.
  • Supported the project in different phases - Dev, SIT, UAT and PROD.
  • Migration of Informatica Mappings/Sessions/Workflows from Dev, QA to Prod environments.
  • Created Groups, roles, privileges and assigned them to each user group.
  • Creation and maintenance of Informatica users and privileges.
  • Worked on SQL queries to query the Repository DB to find the deviations from Company’s ETL Standards for the objects created by users such as Sources, Targets, Transformations, Log Files, Mappings, Sessions and Workflows.
  • Ensure that all support requests are properly approved, documented, and communicated using the MQC tool. Documenting common issues and resolution procedures.
  • Developed and modified jil files for Informatica workflows and scheduled in Autosys scheduling tool.
  • Monitored the incremental data loads and scheduled jobs in Autosys in different phases of the project.
  • Supported the project during warranty phase and fixed critical issues in time.
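The bullets above mention developing JIL files to schedule Informatica workflows in Autosys. A hedged sketch of such a job definition is below; the job, machine, owner, script and log paths are all illustrative placeholders.

```
/* Sketch of an Autosys JIL definition wrapping an Informatica workflow.
   All names and paths are placeholders, not taken from the resume. */
insert_job: DWH_WF_DAILY_LOAD
job_type: c
machine: etl_server1
owner: infa_batch
command: /apps/scripts/run_workflow.sh wf_daily_load
date_conditions: 1
days_of_week: all
start_times: "02:00"
alarm_if_fail: 1
std_out_file: /apps/logs/DWH_WF_DAILY_LOAD.out
std_err_file: /apps/logs/DWH_WF_DAILY_LOAD.err
```

The wrapped shell script would typically call pmcmd with -wait so Autosys sees the workflow's real exit status and can alarm on failure.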

Environment: Informatica PowerCenter 9.6, PowerExchange 9.6, Teradata 13, Oracle 11g, IBM DB2, Tableau, Data Ladder, IBM ClearCase, Erwin Data Modeler.

Confidential, New York, NY

ETL/Informatica Developer

Responsibilities:

  • Involved in gathering requirements from business users on day to day basis.
  • Worked on analyzing systems data stored in an enterprise data warehouse (EDW).
  • Provided documentation for business requirements, data mapping, and production release.
  • Developed ad-hoc queries on Teradata system for summarizing business operations and trends.
  • Developed a number of complex mappings using Lookups (connected and unconnected), Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Normalizer, Sequence Generator, Update Strategy and Mapplets to implement the business logic and load the data incrementally.
  • Developed ETL and Data Quality mappings to load and transform data from source to DWH using Informatica PowerCenter 9.6 and IDQ.
  • Prepared ETL mapping specification documents and Design Documents. Created ETL workflows, sessions and mappings.
  • Scheduled jobs, created workflow variables and parameters dynamically, created and ran workflow tasks, and created workflow parameter files.
  • Design Dimension and Fact tables using Erwin data modeler.
  • Developed Teradata utilities to load external file data to custom table space for data mining.
  • Created Cypher queries for real-time link analysis to establish patterns between data sets.
  • Identified data correlations across vendor data and business data using data matching tool.
  • Worked on BTEQ scripts to push data into Teradata tables.
  • Created job scripts using Teradata Parallel transporter (TPT) from Files or Database, developed data flows through multiple instances of UPDATE operators to update tables.
  • Developed Fast Load scripts to Load external source data into warehouse for analysis.
  • Worked on TPump and MultiLoad as per customer requirements to increase performance while updating records.
  • Automation and scheduling of UNIX shell scripts and Informatica sessions and batches using Autosys scheduling tool.
  • Created a minimum viable product (MVP) data model for statistical insight to business users:
  • Identified data locations and variables, created draft data architecture and data flow.
  • Developed complex queries on DB2 OLTP database and pushed the results to Teradata OLAP database for analytics.
  • Connected Teradata database to create Tableau pages and created rules for alerts.
  • Customized data models and tuned SQL queries for better database performance.
  • Worked on Version control software IBM clear case to update statistical models, queries and documentation.
  • Performed multidimensional analysis of business data, interpreted the results and generated trend or predictive analyses.
  • Understanding data behavior for real time analytics on IBM DB2 transactional system for statistical analysis.
  • Worked on code optimization, end to end testing with all possible test case scenarios.
  • Part of stand-up meetings, sprint retrospective and planning as part of agile software development methodology.
  • Rewrote existing scripts as part of a discovery program for business taxes such as sales and use tax.
  • Developed Tableau dashboards for business users to identify fraud patterns during tax filing season before releasing refunds; pulled production data from IBM DB2 into Teradata using Teradata Parallel Transporter and worked on BTEQ scripts to push data into Teradata tables.
  • Provided quick production fixes and proactively involved in fixing production support issues.
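Several bullets above mention BTEQ scripts that push staged data into Teradata tables. A hedged sketch of such a script is below; the TDPID, credentials and database/table/column names are illustrative placeholders.

```
/* Sketch of a BTEQ script loading staged rows into a target table.
   All identifiers are placeholders, not taken from the resume. */
.LOGON tdprod/etl_user,password;

INSERT INTO DWH.SALES_FACT
SELECT s.sale_id, s.store_id, s.sale_dt, s.amount
FROM   STG.SALES_STG s
WHERE  s.load_dt = CURRENT_DATE;

/* Propagate a non-zero return code to the scheduler on failure. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

Returning a distinct exit code via .QUIT lets the calling shell script or Autosys job distinguish a load failure from a clean run.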

Environment: Informatica PowerCenter 9.1, Informatica PowerExchange 9.1, Tableau, Oracle 10g, COBOL Copybook, Netezza, Teradata 12, Flat Files, SQL Developer, UNIX Shell Programming.

Confidential, Pittsburg, KS

Informatica ETL Developer

Responsibilities:

  • Extensively involved in business requirements gathering.
  • Involved in design sessions to translate user requirements to technical specs.
  • Designed and developed complex Informatica mappings in conjunction with UNIX scripts to load source data into Data Warehouse by applying business logic.
  • Used most of the Informatica-provided functionality (transformations such as Expression, Aggregator, Filter, Router, Sorter, Lookup, Stored Procedure and Normalizer, plus other objects like Mapplets, pre-/post-SQL and pre-/post-session commands) to create complex processes that meet the business requirements.
  • Built complex processes to load inconsistent data from flat files.
  • Created sessions and workflows for designed mappings. Redesigned some of the existing mappings in the system to meet new functionality.
  • Created and used different reusable tasks like command and email tasks for session status.
  • Used Workflow Manager to create Sessions and scheduled them to run at specified time with required frequency.
  • Monitored and configured the sessions that are running, scheduled, completed and failed.
  • Created test cases, participated in System Integration Testing and UAT, and migrated code from Dev to Test to QA to Prod.
  • Compiled report presentations using tools like Business Object reports.
  • Involved in writing UNIX shell scripts for Informatica ETL tool to fire off services and sessions.
  • Designed and Developed complex process for reporting on all source files received from various vendors.
  • Developed complex regular expressions in Informatica to validate email addresses.
  • Automated existing Informatica workflows to run every day in UNIX.
  • Created and executed queries on repository to report on the Informatica objects.
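The email-validation bullet above refers to a regular expression applied in Informatica (typically via REG_MATCH in an Expression transformation). The same pattern can be sanity-checked in shell with grep -E, as sketched below; the pattern is a simple illustration, not an RFC-complete validator.

```shell
#!/bin/sh
# Sketch: the kind of email pattern one would pass to REG_MATCH,
# exercised here with grep -E for quick testing outside Informatica.
EMAIL_RE='^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$'

is_valid_email() {
    printf '%s' "$1" | grep -Eq "$EMAIL_RE"
}

is_valid_email "jane.doe@example.com" && echo "valid"
is_valid_email "not-an-email" || echo "invalid"
```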

Environment: Informatica PowerCenter 9.1.0, DB2, Oracle 11g, Informatica PowerExchange 9.1.0, Linux, Mainframe, Sybase, OBIEE 11g.

Confidential, Bentonville, AR

Informatica Developer

Responsibilities:

  • Designed complex mappings involving target load order and constraint-based loading.
  • Uploaded data from operational source system (Oracle 10g) to Teradata.
  • Wrote UNIX shell scripts to work with flat files, to create pre-and post-session commands and scheduling workflows.
  • Worked with DBAs and Data Architects in planning and implementing appropriate data partitioning strategy for Enterprise Data Warehouse.
  • Developed stored procedures, database triggers and SQL queries where needed.
  • Designed the ETL processes using Informatica to load data from DB2, Oracle, SQL Server, Flat Files, XML Files and Excel files to target Oracle Data Warehouse database.
  • Extensively used XML source qualifier, Mid-stream XML parser transformation, Mid-stream XML generator transformations, as our main source type was XML Files.
  • Extensively used Dynamic Lookup transformation and Update Strategy transformation to update slowly changing dimensions.
  • Created a data mart and extracted data from Oracle using Informatica PowerCenter.
  • Loaded the load-ready file into Teradata tables using Teradata utilities such as FastLoad, MultiLoad, BTEQ and TPump connections; the load date and load time are also captured in the Teradata table.
  • Architected and developed FastLoad and MultiLoad scripts in control files and developed BTEQ scripts to process the data on the staging server.
  • Written SQL overrides in source qualifier according to the business requirements.
  • Used workflow manager for creating, validating, testing and running the Sequential and Concurrent batches and sessions and scheduling them to run at specified time with required frequency.
  • Created and used different tasks like Decision, Event Wait, Event Raise, Timer and E-mail etc.
  • Used Metadata Manager to find relationships and load metadata from different tools such as Informatica, DB2 and Oracle.
  • Responsible for Error Handling and bug fixing.
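The FastLoad bullets above can be illustrated with a script sketch; the TDPID, credentials, table names, delimiter and file path are all illustrative placeholders, not taken from the resume.

```
/* Sketch of a Teradata FastLoad script loading a pipe-delimited
   load-ready file into an empty staging table. */
.LOGON tdprod/etl_user,password;

DROP TABLE STG.SALES_ERR1;
DROP TABLE STG.SALES_ERR2;

BEGIN LOADING STG.SALES_STG
      ERRORFILES STG.SALES_ERR1, STG.SALES_ERR2;

SET RECORD VARTEXT "|";
DEFINE sale_id   (VARCHAR(18)),
       store_id  (VARCHAR(10)),
       sale_dt   (VARCHAR(10)),
       amount    (VARCHAR(20))
FILE = /data/loadready/sales.dat;

INSERT INTO STG.SALES_STG
VALUES (:sale_id, :store_id, :sale_dt, :amount);

END LOADING;
.LOGOFF;
```

FastLoad requires an empty target table, which is why it suits the landing/staging step, with MultiLoad or BTEQ handling the subsequent dimension updates.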

Environment: Informatica PowerCenter 8.6.1, Oracle 10g, Mainframe DB2, XML.
