Sr. BI ETL Developer Resume
GA
SUMMARY
- Around 8 years of professional experience across Data Warehousing, RDBMS, data modeling, data extraction, data acquisition, analysis, application design, development and implementation of data warehousing and Business Intelligence systems using Informatica Power Center, data conversion, and various business applications on different platforms.
- Extensive experience in system analysis, design, development, implementation, production support and maintenance of Data Warehouse business applications in the Pharmaceutical, Finance and Insurance industries, using Informatica Power Center 9.6/9.1/8.6/8.5/8.1/7.1 and Oracle 11g/10g/9i (SQL, PL/SQL).
- Experience in dimensional data modeling concepts such as Star Schema modeling, Snowflake modeling, FACT and Dimension tables, and physical and logical data modeling.
- Worked on data conversion from various data sources such as Oracle, MS SQL Server, Oracle Apps, MS Access, fixed-width and delimited flat files, COBOL files and XML files.
- Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), Power Exchange and Power Connect as ETL tools on Oracle, DB2 and SQL Server databases.
- Worked on transformations such as connected and unconnected Lookup, Aggregator, Expression, Filter, Router, Update Strategy, connected and unconnected Stored Procedure, Sequence Generator and Sorter.
- Expertise in Master Data Management concepts and methodologies, with the ability to apply this knowledge in building MDM solutions.
- Knowledge of advanced programming for data transformation (Java). Created ETL mappings using Informatica Power Center to move data from multiple sources such as flat files, Oracle and DB2 into common target areas such as Data Marts and the Data Warehouse.
- Experienced in implementing multiple end-to-end MDM solutions using Informatica Master Data Management (MDM), IDQ, IDE, IDD and Power Center.
- Extensive knowledge of Teradata SQL Assistant.
- Developed Teradata BTEQ scripts to load data from the Teradata staging area to the Data Warehouse, and from the Data Warehouse to data marts for specific reporting requirements.
- Experience in PL/SQL programming (stored procedures, triggers, packages) using Oracle (SQL, PL/SQL) and SQL Server, and in UNIX shell scripting for job scheduling.
- Quick learner with analytical and logical programming skills, and excellent presentation and interpersonal skills.
- Implemented large projects including effective data integration and data virtualization solutions.
- Good communication and interpersonal skills and a strong understanding of ETL best practices.
- Assigned work to onshore and offshore developers and provided technical expertise for the design and execution of ETL projects.
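As an illustration of the BTEQ and UNIX shell-scripting experience above, a minimal sketch of a shell wrapper that generates a BTEQ load script; the table names and logon file path are hypothetical, and the actual bteq invocation is shown commented out since it requires a Teradata client:

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that loads a warehouse table from staging.
# stg_customer / dw_customer and /etc/td_logon.txt are hypothetical names.

generate_bteq() {
    stg_table="$1"
    dw_table="$2"
    out_file="$3"
    cat > "$out_file" <<EOF
.RUN FILE /etc/td_logon.txt;
.SET ERROROUT STDOUT;

INSERT INTO ${dw_table}
SELECT * FROM ${stg_table};

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
}

generate_bteq stg_customer dw_customer load_customer.bteq
cat load_customer.bteq
# In a real scheduled job the wrapper would then run:
# bteq < load_customer.bteq > load_customer.log 2>&1
```

A scheduler such as Autosys or Control-M would call the wrapper and key off its exit code.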
TECHNICAL SKILLS
ETL Tools: Informatica Power Center 9.6.1/9.5/8.6/8.5/8.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager and Informatica Server), Power Exchange, Tableau, Web Services, IDQ
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2012/7.0/6.5, Sybase, Teradata, MS Access 7.0/2000, Netezza, SQL Server (DTS/SSIS/SSRS)
Data Modeling Tools: ERWIN 4.x; concepts such as Star Schema modeling, Snowflake modeling, FACT and Dimension tables, and physical and logical data modeling
Development Tools: SQL*Plus, TOAD, SQL Navigator, Teradata, SQL Developer
Programming Languages: UNIX shell scripting, Perl scripting, batch scripting, VB.NET 05/08, XML, HTML, C, C++, UML (Rational Rose), Java, Oracle PL/SQL (stored procedures, triggers, indexes), OBIEE
Packages: MS Word, MS Excel, MS Project, MS Visio, MS PowerPoint
Scheduling Tools: Autosys, Tivoli, Control-M, ESP, Maestro, DAC
Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS, IBM S/3, MVS/ESA
PROFESSIONAL EXPERIENCE
Confidential, GA
Sr. BI ETL Developer
Responsibilities:
- Understanding the existing subject areas, source systems, target system, operational data, deployment processes and Production Support activities
- Designed queries for marketing data marts and determined how to make these queries yield comparable data.
- Discovered the source data that caused problems downstream.
- Worked with managers to translate business requirements into data fields.
- Worked closely with the Business Objects reporting team to meet users' ad hoc and other reporting needs
- Designed the overall ETL solution including analyzing data, preparation of high level, detailed design documents, test plans and deployment strategy
- Proactively resolved open defects against MDM and other subject areas.
- Carried out the testing strategy/validations against MDM subject area by implementing key test cases
- Extracted and loaded data from/into diverse source and target systems including Teradata, Oracle, Siebel CRM, XML and flat files.
- Analyzed the data models of the source & target systems to develop comprehensive mapping specifications
- Execution of SQL queries to extract data from DB2 tables for running test scripts.
- Raised tickets/issues with the respective teams in the event of production failures and followed up with those teams to get the issues fixed as early as possible.
- Worked on Informatica Data Quality to resolve customer address-related issues.
- Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems
- Developed Perl and Shell scripts for upload of data feed into database
- Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
- Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages and query groups.
- Involved in defining the source to target data mappings, business rules and data definitions.
- Carried out Data Analysis for mapping of all sources of Data involved
- Identified and removed bottlenecks to improve the performance of mappings, sessions and workflows, using partitioning and fixes at the source and target level.
- Designed database & UNIX Scripts needed to meet the system requirements.
- Designed documents and configured the Informatica MDM Hub to support loading, cleansing, matching, merging and publication of MDM data.
- Optimized SQL queries for better performance using hints, indexes and Explain Plan; worked closely with business leaders and the functional and technical teams on all operational activities leading to development
- Extensive experience working in an Agile development environment
- Created new tables, stored procedures and user-defined functions for application developers; created SQL scripts for tuning and scheduling
- Responsible for developing OLAP cubes, data source views, partitions and queries
- Used various transformations in SSIS Data Flow and Control Flow, including For Loop containers and Fuzzy Lookups
- Used SSIS packages to populate data from Excel into the database, using Lookup, Derived Column and Conditional Split transformations to achieve the required data.
- Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad and FastLoad
- Performed data conversions from flat files into a normalized database structure
- Experience in performance tuning of sources, targets, mappings, transformations and sessions by implementing techniques such as partitioning and pushdown optimization, and by identifying performance bottlenecks
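The flat-file-to-normalized-structure conversion mentioned above can be sketched with standard UNIX tools; the pipe-delimited layout, column meanings and file names here are hypothetical:

```shell
#!/bin/sh
# Sketch: split a denormalized order extract (customer attributes
# repeated on every order row) into normalized customer and order files.

cat > orders_raw.dat <<'EOF'
1001|Acme Corp|GA|O-1|250.00
1001|Acme Corp|GA|O-2|75.50
1002|Beta LLC|NY|O-3|120.00
EOF

# Customers: keep the first occurrence of each customer id (cols 1-3).
awk -F'|' '!seen[$1]++ { print $1 "|" $2 "|" $3 }' orders_raw.dat > customers.dat

# Orders: keep the order columns plus the customer id as a foreign key.
awk -F'|' '{ print $4 "|" $1 "|" $5 }' orders_raw.dat > orders.dat

cat customers.dat
cat orders.dat
```

The two output files can then be bulk-loaded into the parent and child tables.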
Environment: Informatica Power Center 9.6/9.5, Oracle 11g, SQL Server 2012, PL/SQL, Siebel CRM, UNIX, OBIEE, DAC, Teradata.
Confidential, Iowa
ETL Developer/Production support
Responsibilities:
- Performed source data analysis and table design using ESP; improved cube designs for ease of loading on straight moves.
- Changed designs and mappings as needed to improve performance. Performed business analysis and data model design for new ETL-related tables. Designed and created the database tables needed for the application
- Implemented the upgrade plan, including testing of all production mappings, sessions and workflows; modified mappings, sessions and workflows as needed to get them working as designed in the upgraded environment
- Started working on Delta File Creation (Pre-Stage) in Hadoop using Informatica
- Responsible for all activities related to the development, implementation, administration and support of ETL processes for large scale data warehouses using Informatica Power Center
- Extensive experience working in an Agile development environment
- Worked on all phases of data warehouse development lifecycle, ETL design and implementation, and support of new and existing applications
- Expert in writing T-SQL and working on DTS, SSIS, SSRS, SSAS, data cleansing, data scrubbing and data migration.
- Migrated all DTS packages to SQL Server Integration Services (SSIS) and modified the packages accordingly using the advanced features of SQL Server Integration Services.
- Played a major role in production support of SSAS cube and SSIS jobs
- Experienced on working with Big Data and Hadoop File System (HDFS)
- Implemented proofs of concept on the Hadoop stack and different big data analytic tools; migrated data from different databases (i.e. Teradata, Oracle, MySQL) to Hadoop
- Loaded datasets into Hive for ETL (Extract, Transform and Load) operations
- Maintained Tableau reports and analyses; worked on enhancements to existing reports and created new reports based on business needs.
- Designed, developed, unit tested and supported ETL mappings and scripts for data marts using Talend; checked and fixed delimiters in ASCII files.
- Utilized existing Informatica, Teradata, SQL Server, Salesforce integration and UNIX to deliver work and fix production issues on time in a fast-paced environment
- Responsible for security, backup, recovery, modifications, and new DB2 databases
- Strong hands-on experience using Teradata utilities; coded and debugged FastLoad, FastExport, MultiLoad and TPump scripts for Teradata ETL processing of huge data volumes
- Created mappings using pushdown optimization to achieve good performance in loading data into Netezza
- Involved in performance tuning of Informatica mappings using Informatica Pushdown Optimization (PDO), which provides the same performance impact as a BTEQ script while also providing good data lineage
- Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad and Informatica
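The delimiter checking and fixing of ASCII feed files described above can be sketched as a pre-load validation step; the four-column layout and file names are hypothetical:

```shell
#!/bin/sh
# Sketch: rows whose field count does not match the expected layout
# are quarantined to a reject file before the load runs.

EXPECTED_FIELDS=4

cat > feed.txt <<'EOF'
id,name,state,amount
1,Acme,GA,100
2,Beta,NY
3,Gamma,TX,300
EOF

awk -F',' -v n="$EXPECTED_FIELDS" \
    'NF == n { print > "feed_clean.txt"; next }
             { print > "feed_reject.txt" }' feed.txt

echo "clean:  $(wc -l < feed_clean.txt) rows"
echo "reject: $(wc -l < feed_reject.txt) rows"
```

Only feed_clean.txt would be handed to the ETL job; the reject file goes back to the data provider.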
Environment: Informatica Power Center 9.6, Power Exchange, Oracle 11g, ESP, Tableau, Salesforce, PL/SQL, UNIX, Teradata, Hadoop, Netezza, Talend, Quality Center
Confidential, Atlanta, GA
BI Developer/Production support
Responsibilities:
- Parsed high-level design specification to simple ETL coding and mapping standards
- Designed and customized data models for the Data Warehouse, supporting data from multiple sources in real time. Involved in building the ETL architecture and source-to-target mappings to load data into the Data Warehouse.
- Monitor system integrity through daily system checks, to ensure a high standard of service is provided to clients
- Remain calm under pressure and ensure IT management and business stakeholders are kept fully updated on progress of critical issues
- Escalate issues and remediation steps effectively and in a timely manner.
- Perform troubleshooting analysis and resolution of critical applications and batch processes.
- Support the Client Reporting Production Environment as a first priority.
- Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions
- Experience in using Automation Scheduling tools like Autosys and Control-M
- Modified reports and Talend ETL jobs based on the feedback from QA testers and Users in development and staging environments.
- Created a change management strategy; identified, analyzed and prepared risk mitigation tactics; identified and managed anticipated resistance
- Developed transformation logic to cleanse the source data of inconsistencies before loading it into the staging area, which is the source for stage loading
- Participated in the full development lifecycle within the reporting framework, from developing business intelligence requirements to QA and production deployment
- Wrote SQL and PL/SQL scripts to extract data from databases; data updates were handled via Power Exchange Change Data Capture
- Prepared migration document to move the mappings from development to testing and then to production repositories
- Expertise in transforming data imported from disparate data sources into analysis data structures using SAS functions, options, ODS, array processing and the macro facility, and in storing and managing data in SAS data files.
- Worked on Informatica Data Quality to resolve customer address-related issues; worked on Informatica On Demand mainly to import data from Salesforce
- Loaded data into dimension tables in SQL Server using SSIS packages
- Expert in using OBIEE Answers to create queries, format views, charts, and add user interactivity and dynamic content to enhance the user experience.
- Developed data models according to company standards. Used DB2 Connect to bring multiple database source systems into a single analytical system for large databases; provided 24x7 production support for data warehousing
Environment: Informatica Power Center 9.6, Power Exchange, Siebel, OBIEE, Talend 4.1, DB2, Oracle 11g, Control-M, Salesforce, PL/SQL, UNIX.
Confidential, Carmel, Indiana
Informatica ETL Developer
Responsibilities:
- Participated in design sessions where decisions are made involving the transformation from source to target
- Responsible for developing various data extracts and loading routines using Informatica and Oracle stored procedures (PL/SQL), packages and triggers
- Managed and compiled documentation for all ETL processes.
- Created detailed process flow designs and drafted high-level Standard Operating Procedures.
- Performed Informatica Administration tasks
- Performed tuning of PL/SQL procedures, views and Informatica objects
- Designed and implemented the error handling strategy for the ETL team.
- Monitored data warehouse month-end loads to ensure successful completion
- Created mappings, mapplets and sessions for data loads and data cleansing; enhanced existing mappings using Informatica Power Center
- Performed match/merge and ran match rules to check the effectiveness of the MDM process on data; created the unit test case document
- Carried out the testing strategy/validations against MDM subject area by implementing key test cases
- Gather and analyze business requirements.
- Design logical and physical data models and data processing infrastructure.
- Provide production database support, enhance existing system and design and develop new systems.
- Monitor the usage of database space and performance.
- Performance optimization of the database and applications
- Involved in ETL process from development to testing and production environments
- Developed mappings using various transformations like update strategy, lookup, stored procedure, router, joiner, sequence generator and expression transformation.
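The month-end load monitoring and error handling described above can be sketched as a post-load log check; the session log format and file names are hypothetical:

```shell
#!/bin/sh
# Sketch: scan a session log after the month-end load and report
# whether any error lines were written during the run.

cat > monthend_session.log <<'EOF'
2015-01-31 22:00:01 INFO  session s_m_load_gl started
2015-01-31 22:14:45 INFO  rows loaded: 1048576
2015-01-31 22:14:46 ERROR rows rejected: 12
2015-01-31 22:14:47 INFO  session s_m_load_gl completed
EOF

errors=$(grep -c 'ERROR' monthend_session.log)
if [ "$errors" -gt 0 ]; then
    status="WARN: $errors error line(s) found"
else
    status="OK: load completed clean"
fi
echo "$status"
```

In practice the WARN branch would page the on-call developer or open a ticket automatically.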
Environment: Informatica Power Center 9.1, Power Exchange, Erwin, Business Objects 6.5, Cognos, Tivoli, Oracle 9i, DB2, XML, T-SQL, MS SQL Server 2005, UNIX (AIX), UNIX Shell, Salesforce.
Confidential, New York City, NY
ETL Informatica Developer
Responsibilities:
- Analyzed source data and gathered requirements from the business users. Informatica Power Center 8.5 was used to extract, transform and load data from different operational data sources such as Oracle (as an ODS), DB2 and flat files.
- Coordinating with source systems owners, day-to-day ETL progress monitoring, Data warehouse Target schema design (star schema) and maintenance.
- Experience working on performance tuning and optimization of Teradata queries.
- Involved in data migration between Teradata, MS SQL Server, DB2 and Oracle. Experience in database programming with SQL and PL/SQL (stored procedures, functions and triggers) for faster transformations and development using Teradata and Oracle. Used mapping parameters and variables to pass values between sessions.
- Developed mappings with transformations and mapplets conforming to the business rules
- The present system generates reports using a Java application. Participated in the installation and configuration of Oracle BI EE 10.1.3.4.1 in both Windows and Linux environments. Performed performance tuning of sources, targets, mappings and SQL queries in the transformations
- Worked on database triggers, stored procedures, functions and database constraints. Performed unit, integration and regression testing and prepared test cases. Coordinated work with the offshore team.
Environment: Informatica Power Center 8.5 (Repository Manager; Designer tools such as Source Designer, Warehouse Designer, Mapping and Mapplet Designer; Workflow Manager; Workflow Monitor), Power Exchange, DB2 8.0, Oracle 10g, UNIX shell scripting, Autosys, Java, JavaScript, OBIEE.
Confidential
ETL Informatica Developer
Responsibilities:
- Worked with the business analysts and the DBA on requirements gathering and project coordination. Designed and developed ETL routines using Informatica Power Center 8.1
- Within Informatica mappings, extensively used Lookups, Aggregator, Rank, Mapplets, connected and unconnected stored procedures/functions, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers to manage data flow into multiple targets
- Designed and developed data validation and load processes (using PL/SQL and SQL) and pre-session, post-session and batch execution routines using UNIX shell scripts. Implemented slowly changing dimensions
- Performed logical and physical data modeling (relational and multi-dimensional) using ERWIN and used Power Plug to forward-engineer to the Informatica Repository. Monitored and tuned SQL using Explain Plans and Oracle hints, and tuned Informatica mappings using SQL overrides and source filters while managing cache file allocations and sizes. Used TOAD to run SQL queries
- Extensively used SQL*Loader to load data from flat files into database tables in Oracle; involved in data extraction from Oracle and flat files using SQL*Loader.
- Developed and tested all the Informatica data mappings, sessions and workflows, involving several tasks.
- Generated ad hoc reports using Query Studio in Cognos. Strong experience writing SQL queries and PL/SQL procedures in Oracle Database and DB2 UDB. Implemented stored procedures and packages utilizing PL/SQL for enhancements to the existing application-level data warehouse
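The SQL*Loader usage described above can be sketched as follows; the table, column and file names are hypothetical, and the sqlldr call is shown commented out since it requires an Oracle client and database:

```shell
#!/bin/sh
# Sketch: a SQL*Loader control file mapping a comma-delimited flat
# file onto a staging table, plus a matching sample data file.

cat > employees.ctl <<'EOF'
LOAD DATA
INFILE 'employees.dat'
APPEND INTO TABLE stg_employees
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
( emp_id, emp_name, dept_code, hire_date DATE 'YYYY-MM-DD' )
EOF

cat > employees.dat <<'EOF'
10,"Smith, John",D01,2004-03-15
11,"Lee, Ana",D02,2005-07-01
EOF

cat employees.ctl
# In a real load:
# sqlldr userid=etl/passwd control=employees.ctl log=employees.log
```

OPTIONALLY ENCLOSED BY '"' keeps embedded commas in quoted names from being split into extra fields.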
Environment: Informatica Power Center 8.1, Oracle 10g, Cognos, UNIX (HP-UX 11.x), Sun Solaris (2.6), Windows 2K, UNIX shell scripts, PL/SQL, DB2 Connect, TOAD