ETL/BI Developer Resume
Charlotte, North Carolina
SUMMARY
- 8 years of IT experience in Data Warehousing, Database Design, and ETL processes in Test and Production environments across business domains such as finance, manufacturing, and healthcare.
- Strong experience in extracting, transforming, and loading (ETL) data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter.
- Extensive experience integrating heterogeneous data sources such as SQL Server, Oracle, Teradata, flat files, Excel, and XML files, and loading the data into Data Warehouses and Data Marts using PowerCenter.
- Strong skills in SQL and PL/SQL backend programming, creating database objects such as Stored Procedures, Functions, Cursors, Triggers, and Packages.
- Worked extensively on creating complex mappings using transformations such as Connected and Unconnected Lookup, Source Qualifier, Joiner, Rank, Sorter, Router, Filter, Expression, Aggregator, Update Strategy, and Normalizer.
- Very good knowledge of data warehouse design using Star Schema and Snowflake Schema architectures, Fact and Dimension tables, UML concepts, and OLTP and OLAP applications. Good experience configuring the Data Warehouse Administration Console (DAC). In-depth understanding of Data Warehousing and Business Intelligence concepts.
- Designed and developed efficient error-handling methods and implemented them throughout the mappings in various projects.
- Scheduled sessions and batches on the Informatica server.
- Developed UNIX shell scripts and PL/SQL procedures; extensively used shell scripts to create parameter files dynamically and scheduled jobs using Autosys.
- Experienced with various data sources such as DB2, SQL Server, Oracle, and fixed-width and delimited flat files.
- Good experience in Data Modeling, with expertise in creating Star and Snowflake schemas, Fact and Dimension tables, and physical and logical data models using Erwin.
- Worked on Requirement Analysis, Data Analysis, Application Design, Data Modeling, and Application Development for payment and transactional data across various business application systems in the financial domain (JPMC), Confidential.
- Experienced working with pricing and contracting for the JPMC client.
- Strong experience writing UNIX shell scripts and SQL scripts for development, ETL process automation, error handling, and reporting.
- Knowledge of Informatica DVO and IDE/IDQ tools for Data Analysis, Data Profiling, and Data Governance.
- Used Teradata utilities such as FastLoad, FastExport, MultiLoad, and TPump for ETL processing of high-volume data; also involved in implementation and batch monitoring.
- Captured DQ metrics using profiles and created scorecards to review data quality in IDQ.
- Assisted in troubleshooting production support problems related to the Teradata database and IDQ.
- Knowledge of C++, Python, and UNIX, used extensively to complete project assignments on time.
- Knowledge of SFTP transfers using SSH keys.
- Used the Basic Teradata Query (BTEQ) utility for reporting purposes. Built the Operational Data Store using BTEQ, Teradata, Oracle, SQL, and UNIX.
- Expertise in different loading types, such as Normal and Bulk loading, and their challenges. Involved in initial, incremental, daily, and monthly loads.
- Analyzed data distribution and reviewed index choices. Used Teradata Viewpoint to monitor loads and developed SQL scripts to improve load performance.
- Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN and statistics collection; a sketch follows this summary.
- Excellent analytical and logical programming skills with a good conceptual understanding, plus strong presentation and interpersonal skills and a strong desire to achieve specified goals.
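For illustration, a minimal sketch of the Teradata tuning workflow described above: inspect the optimizer's plan with EXPLAIN, then refresh statistics on the columns the plan relies on. All table and column names here are hypothetical.

```sql
-- Hypothetical tables: sales_fact, customer_dim.
EXPLAIN
SELECT c.customer_id,
       SUM(s.sale_amt) AS total_sales
FROM   sales_fact s
JOIN   customer_dim c
  ON   c.customer_key = s.customer_key
WHERE  s.sale_date >= DATE '2015-01-01'
GROUP  BY c.customer_id;

-- Refresh statistics on the join and filter columns so the optimizer
-- can choose better join and redistribution plans.
COLLECT STATISTICS ON sales_fact COLUMN (customer_key);
COLLECT STATISTICS ON sales_fact COLUMN (sale_date);
```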
TECHNICAL SKILLS:
Programming Languages: C, C++, SQL, PL/SQL, UNIX Shell Scripting, SSH.
Application Development: Informatica PowerCenter 9.x/8.x, OLAP, OLTP, SQL*Loader, Informatica Power Connect for DB2, IDQ, SQL*Plus, Erwin, IBM DB2, Toad, Oracle 11g/10g, SQL Server, PuTTY.
RDBMS: Oracle, SQL Server, IBM DB2.
Databases/DB Tools: Oracle, IBM DB2, SQL Server, Teradata, SSIS.
Methodologies/Techniques: Relational and Dimensional (Star Schema) Data Modeling, Data Warehouse and Data Mart Design, Waterfall/Agile Methodology.
Testing Tools: Knowledge in Manual DW Testing using SQL.
Utilities/ Command Languages: IBM DB2 Load Utility, IBM DB2 Export.
PROFESSIONAL EXPERIENCE
Confidential, Charlotte, North Carolina
ETL/BI Developer
Responsibilities:
- Worked in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.
- Involved in developing solutions for business challenges by leveraging UNIX scripting, Oracle SQL, and Autosys.
- Worked with business analysts to identify and understand requirements and translated them into ETL code during the Requirement Analysis phase.
- Worked on enhancements by creating DARBI Requests and ALM entries.
- Worked closely with business users regarding volume issues in the GE Lift and Shift project.
- Used ETL processes to load data from multiple sources into the staging area (Oracle) using Informatica PowerCenter 9.6.1.
- Worked on identifying the flow of jobs and analyzing BO reports in the GE Lift and Shift project.
- Provided 24/7 production support, running ETL executions and resolving root causes.
- Created Problem tickets and Work Requests to address production issues.
- Designed mappings to extract data from various sources, including relational tables and flat files.
- Member of the on-call team providing support for daily, weekly, and monthly batch loads.
- Developed and scheduled daily, weekly, and monthly load jobs by creating JIL scripts in CA Workload Automation that invoked the Informatica workflows and sessions associated with the mappings.
- Worked on an SFTP project, creating new JIL scripts to transfer files every day and shell scripts to pick up the files from the folder.
- Involved in migrating the SFTP transmission process to DTS by creating new connections and folders.
- Created SQL scripts to drop and recreate indexes on source and target tables, invoked through Autosys scheduled jobs; a sketch follows this list.
- Performed fine-tuning of existing mappings to increase their performance.
- Involved in deploying Informatica and UNIX shell script code to SIT, UAT, and Production environments.
- Documented database changes, Informatica changes, and JIL scripts; worked on post-install validation scripts and rollback of database changes.
- Intensively worked with the Debugger within the Mapping Designer to troubleshoot predefined mappings.
- Worked extensively on Informatica transformations such as Source Qualifier, Expression, Filter, Router, Union, Aggregator, Lookup, and Update Strategy.
- Involved in creating Mapplets, Worklets, and reusable transformations.
- Involved in migrating the Mappings, Sessions, Workflows from Test environment to Production environment.
- Worked on Error Handling, Scheduling, Identifying ETL Job Dependency and Recovery mechanisms.
- Redesigned and migrated existing mappings in Informatica PowerCenter to meet new requirements.
- Worked on mapping documentation, identifying sources, targets, hashed files, transformation stages, and the logic between them using DataStage XML reports.
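A minimal sketch of the drop-and-recreate-index pattern referenced above, as it might run from an Autosys-invoked SQL script (Oracle syntax; table and index names are hypothetical):

```sql
-- Pre-load step: drop the index so the bulk load avoids index maintenance.
DROP INDEX stg_orders_ix1;

-- ... the Informatica session loads STG_ORDERS here ...

-- Post-load step: rebuild the index and refresh optimizer statistics.
CREATE INDEX stg_orders_ix1
    ON stg_orders (order_id, order_date)
    NOLOGGING;

EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'STG_ORDERS');
```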
Environment: Informatica 9.6, Oracle 11g, Oracle SQL Developer, PL/SQL, SQL Server, UNIX, Autosys scheduling, Shell Scripting, Waterfall/Agile Methodology, flat files.
Confidential, Salem, NH
ETL/SQL Developer
Responsibilities:
- Involved in gathering and analyzing requirements with IT functional and technical product managers and business users.
- Prepared source-to-target data mappings and business rules for the ETL processes.
- Implemented a restartability framework to recover failed sessions.
- Worked on data integrity and data validation checks for ETL loads.
- Worked on exception handling and exception reprocessing for ETL loads.
- Created a Time Estimate Proposal document estimating the hours required to complete each ETL task.
- Designed, developed, documented, and built Teradata UNIX scripts for the FastLoad, FastExport, BTEQ, and MultiLoad utilities to load data into Teradata tables.
- Converted business requirements into technical documents (Business Requirement Document) and explained business requirements in terms of technology to the developers.
- Worked with the data modeler and business users on table design.
- Maintained data integrity through session control.
- Worked as an SME on the BCBS project.
- Worked on data masking to hide credit transaction data in production.
- Performed data cleansing, massaging, masking, and data profiling using IDQ.
- Processed structured and unstructured data such as PDF files, spreadsheets, Word documents, and print streams to obtain normalized data using Informatica B2B Data Exchange.
- Used ODI to effectively integrate heterogeneous data sources and convert raw data into useful information.
- Worked closely with the Architect and Data Modeler to enhance the existing model with new attributes per business needs.
- Hands-on experience working with ODI Knowledge Modules such as LKM, IKM, JKM, and CKM.
- Worked on building and modifying queries in the Visual Studio Team Foundation Server environment.
- Responsible for creating batches and scripts to implement the logical design in T-SQL.
- Extended the base ODI Knowledge Modules to create customized versions that are reused across other data integration projects.
- Implemented Change Data Capture using Informatica Power Exchange 9.1.
- Deployed DataStage jobs to various environments such as Test and UAT.
- Used ODI Designer for data mapping and business rule implementation, and developed mappings and workflows to transfer data from source systems to target systems.
- Worked on loading XML files and flat files in the IRS project.
- Configured and used B2B Data Exchange for end-to-end data visibility through event monitoring and to provide universal data transformation supporting numerous formats, documents, and filters.
- Worked on unit testing and documented positive and regression testing of the mappings; produced workflow design documents and process control documents for the workflows.
- Created mappings to load dimension and fact tables from multiple source systems such as Oracle, SQL Server, and flat files for the data warehouse. Worked on the critical EIS project to build the data warehouse and was also involved in the Data Mart.
- Extensive knowledge of Data Analysis, data requirement gathering, and data mapping for ETL processes, as well as the scheduler tool Control-M.
- Well versed in OLTP Data Modeling and Data Warehousing concepts.
- Involved in data modeling to determine data definitions and establish referential integrity of the system.
- Worked on full integration testing with Informatica DVO.
- Worked on value tests, SQL views, and join views to test data.
- Resolved errors in Data Validation Option.
- Worked on setting up table constraints and associated tests in DVO.
- Strong experience in Data Warehousing and data integration testing using DVO.
- Experienced in Agile methodology as a lead, working with an offshore team.
- Designed and developed complex mappings involving SQL Transformation, Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure, and other transformations.
- Created reusable transformations, sessions, and mapplets to simplify ETL processes.
- Used PL/SQL stored procedures to truncate tables, drop partitions, and gather statistics from within Informatica mappings; a sketch follows this list.
- Created tasks in the Workflow Manager and exported and executed IDQ mappings.
- Developed back-end PL/SQL packages and built UNIX shell scripts for data migration and batch processing.
- Experienced in working with Linux shell scripts to handle data arriving in flat files.
- Extensively used the Informatica Debugger for troubleshooting using data breakpoints.
- Coded using Teradata analytical functions and BTEQ SQL, and wrote UNIX scripts for validation.
- Hands-on experience in business integration development.
- Worked with DBAs on creating tables, indexes, stored procedures, and partitions for the identified data model.
- Created data lineage through Informatica Metadata Manager.
- Worked with PVCS (Polytron Version Control System) and Git for moving ETL and SQL code.
- Used Batch Control to schedule Control-M jobs that invoke Informatica ETL workflows.
- Worked with file watchers and cyclic, hourly, daily, monthly, annual, and on-demand jobs in Control-M.
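A minimal sketch of the kind of PL/SQL maintenance procedure referenced above, callable from a pre- or post-session Stored Procedure transformation. The procedure, parameter, and action names are hypothetical:

```sql
CREATE OR REPLACE PROCEDURE etl_table_maint (
    p_action    IN VARCHAR2,            -- 'TRUNCATE', 'DROP_PART', or 'STATS'
    p_table     IN VARCHAR2,
    p_partition IN VARCHAR2 DEFAULT NULL
) AS
BEGIN
    IF p_action = 'TRUNCATE' THEN
        -- Full refresh: empty the target before the load.
        EXECUTE IMMEDIATE 'TRUNCATE TABLE ' || p_table;
    ELSIF p_action = 'DROP_PART' THEN
        -- Partition-level refresh: drop only the partition being reloaded.
        EXECUTE IMMEDIATE 'ALTER TABLE ' || p_table ||
                          ' DROP PARTITION ' || p_partition;
    ELSIF p_action = 'STATS' THEN
        -- Post-load: re-gather optimizer statistics on the target.
        DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => p_table);
    END IF;
END etl_table_maint;
/
```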
Environment: Informatica PowerCenter 9.1/9.6/10.1, Informatica DVO 9.1, IDQ 9.6, Oracle 11g, SQL Server 2012, ODI 12c, Informatica B2B Data Exchange, T-SQL, Tableau, PuTTY, JIRA, Teradata, Toad 10.6, SQL*Loader, MS Visio, Control-M, MDM, Oracle EBS, WinSCP, UNIX/Linux Shell scripting, flat files, Erwin, Git, PVCS, SharePoint, Confluence.
Confidential, Charlotte
Informatica/Database Developer
Responsibilities:
- Responsible for gathering functional and technical requirements and documenting them.
- Worked with the business analysts in requirement analysis to implement the ETL process.
- Involved in low-level design of the scripts for database sequences, constraints, triggers, and stored procedures.
- Extensively used Informatica to load data from various data sources such as flat files, Oracle, and SQL Server into the Enterprise Data Warehouse.
- Extensively worked on conformed dimensions for incremental loading of the target database.
- Developed scripts to load data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
- Worked on the ODS and Data Mart, and was involved in designing the ETL use cases.
- Performed unit, system, and regression testing of the mappings; wrote test cases and assisted users in performing UAT.
- Extensively used Full Pushdown Optimization in Informatica to improve performance of Netezza DB loads.
- Thorough understanding of external tables, distribution, GROOM, and other Netezza features.
- Processed structured and unstructured data such as PDF files, spreadsheets, Word documents, legacy formats, and print streams to obtain normalized data using Informatica B2B Data Exchange.
- Used Informatica B2B Data Exchange to process structured data such as XML.
- Configured and used B2B Data Exchange for end-to-end data visibility through event monitoring and to provide universal data transformation supporting numerous formats, documents, and filters.
- Guided and coached multiple team members in Agile methodology and provided constructive performance feedback.
- Responsible for creating complex mappings according to business requirements, scheduled through the ODI Scheduler.
- Created ODI packages and jobs of various complexities and automated process data flows.
- Good working knowledge of ODI Operator for debugging and viewing the execution details of interfaces and packages.
- Worked on data modeling and produced data mapping and data definition documentation.
- Built DataStage jobs to transform, integrate, and load data into the data warehouse database using DataStage stages such as Merge, Lookup, Change Capture, and Transformer.
- Conducted design discussions and meetings to arrive at the appropriate data model.
- Designed a logical data model using the database tool Erwin 8.
- Extensive experience identifying performance issues, debugging, and performance tuning using SQL Trace, explain plan, indexing, partitioning, DBMS Profiler, and Oracle code coverage utilities.
- Developed several complex IDQ mappings using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in the Mapping Designer.
- Designed and developed reusable IDQ mappings to match accounting data based on demographic information.
- Worked with IDQ on data quality: data cleansing, making data robust, removing unwanted data, and ensuring data correctness.
- Created tasks in the Workflow Manager and exported and executed IDQ mappings.
- Worked on IDQ parsing, standardization, matching, and web services.
- Imported the mappings developed in Data Quality (IDQ) into the Informatica Designer.
- Worked on performance tuning of ETL procedures and processes.
- Developed OLAP models for analysis of facts, measures, dimensions, and hierarchies.
- Tuned the mappings for better response times.
- Used the Teradata utilities FastLoad, MultiLoad, and TPT to load data.
- Wrote and executed Teradata SQL for SQL reports and current business requirements, and documented business rules and calculation scripts.
- Extensively used Perl scripts to edit XML files and calculate line counts according to the client's needs.
- Developed parameter- and dimension-based reports, drill-down reports, matrix reports, charts, and tabular reports using Tableau Desktop.
- Executed and tested required queries and reports before publishing dashboards.
- Read business user requirements, analyzed data, and designed software solutions in Tableau Desktop based on those requirements.
- Extensively worked on processing structured and unstructured data with Informatica B2B; Informatica B2B Data Transformation supports transformations and mappings via XML.
- Extensively worked with Slowly Changing Dimensions Type 1 and Type 2 for data loads; a sketch of the Type 2 logic follows.
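A minimal sketch of the SCD Type 2 pattern mentioned above (Oracle syntax; the customer_dim and customer_stg tables and their columns are hypothetical): expire the current row when a tracked attribute changes, then insert the new version.

```sql
-- Step 1: close out current rows whose tracked attributes changed.
UPDATE customer_dim d
SET    d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   customer_stg s
               WHERE  s.customer_id = d.customer_id
               AND    (s.address <> d.address OR s.status <> d.status));

-- Step 2: insert a new current row for changed and brand-new customers.
INSERT INTO customer_dim
       (customer_key, customer_id, address, status,
        eff_start_date, eff_end_date, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.status,
       SYSDATE, DATE '9999-12-31', 'Y'
FROM   customer_stg s
WHERE  NOT EXISTS (SELECT 1
                   FROM   customer_dim d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');
```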
Environment: Informatica PowerCenter 9.5, IDQ 9.6, ODI 12c, Informatica B2B Data Exchange, Erwin, SSIS, Oracle 11g, PL/SQL, Netezza, Agile/Scrum, TOAD, Reports 9i/10g, SQL Server 2012, flat files, Tableau 9.1, Perl scripting, Python, Teradata V13, Autosys, Git, UNIX Shell Scripting, Windows XP, Linux.
Confidential, Pleasanton, CA
ETL Informatica Developer
Responsibilities:
- Responsible for documentation, version control of schema and version release.
- Analyzed specifications, identified source data to be moved to the data warehouse, and participated in Design Team and user requirement gathering meetings.
- Participated in the analysis of the development environment for the extraction process and the development architecture of the ETL process.
- Coordinated with customer in finding the sources and targets for data conversion.
- Analyzed the source data with business users and developed critical mappings using Informatica PowerCenter to load data from DB2 to Oracle.
- Worked with federal government sponsored healthcare insurance programs (Affordable Care Act (ACA), SNAP, TANF, Medicaid, PHI)
- Experience with Online Transaction Processing and workflow.
- Ability to tabulate, extract, graph, and analyze data; demonstrates strong analytical/statistical skills.
- Scheduled sessions and batches on the Informatica server. Developed UNIX shell scripts and PL/SQL procedures.
- Extensively used Linux shell scripts to create parameter files dynamically and scheduled jobs using Autosys.
- Extensively involved in ETL storage and partitioning, aggregations, and query calculations with MDX, data mining models, and developing reports using MDX and T-SQL.
- Used ODI Designer to develop complex interfaces (mappings) to load the data from the various sources into dimensions and facts.
- Implemented the Change Data Capture (CDC) feature of ODI to minimize data load times; a sketch of the underlying incremental-extract idea follows this list.
- Created complex ETL packages in SSIS/Informatica.
- Created reusable transformations and mapplets for use in multiple mappings.
- Extracted data from SQL Server source systems and loaded it into Oracle target tables.
- Used the SQL*Loader utility for moving data between source systems.
- Involved in migrating the existing ETL process to Informatica PowerCenter.
- Healthcare/Payer industry experience: worked for a year in healthcare programs such as ACA, SNAP, and TANF.
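ODI's journalized CDC is configured in the tool itself, but the incremental-extract idea behind it can be sketched in plain SQL. The watermark table (etl_control) and all names below are hypothetical:

```sql
-- Pull only rows changed since the last successful extract.
SELECT m.member_id, m.plan_code, m.status, m.last_update_ts
FROM   member_src m
WHERE  m.last_update_ts > (SELECT last_extract_ts
                           FROM   etl_control
                           WHERE  source_name = 'MEMBER_SRC');

-- After a successful load, advance the watermark.
UPDATE etl_control
SET    last_extract_ts = SYSTIMESTAMP
WHERE  source_name = 'MEMBER_SRC';
```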
Environment: Informatica PowerCenter 9.1, SSIS, Power Exchange, Workflow Manager, Workflow Monitor, PL/SQL, DB2, Oracle, T-SQL, SQL*Loader, flat files, ODI (Oracle Data Integrator) 11g, Business Objects, Linux, Windows 2000.
Confidential, Kansas City, MO
SQL/ETL Developer
Responsibilities:
- Analyzed the functional specs provided by the data architect and created technical specs documents for all the mappings.
- Developed logical and physical data models that capture current state/future state data elements and data flows.
- Worked on Power Center Designer client tools like Source Analyzer, Warehouse Designer, Mapping Designer and Mapplet Designer.
- Created transformations such as Lookup, Joiner, Rank, and Source Qualifier in the Informatica Designer.
- Coordinated with source system owners on day-to-day ETL progress monitoring, data warehouse target schema design (Star Schema), and maintenance.
- Data modeling experience using dimensional data modeling, Star Schema modeling, and physical and logical data modeling in Erwin.
- Designed and developed reusable transformations for ETL using Informatica PowerCenter 8.6.1.
- Worked extensively on transformation types such as Normalizer, Expression, Union, Filter, Aggregator, Update Strategy, Lookup, Stored Procedure, Sequence Generator, and Joiner.
- Designed and developed complex mappings such as Slowly Changing Dimensions Type 1 and Type 2.
- Extensively worked in performance tuning of programs, ETL procedures and processes.
- Developed and invoked PL/SQL stored procedures and functions for data processes used in the Informatica mappings.
- SCRUM/Agile content lead in cross-functional stand-ups, grooming sessions, and retrospectives (JIRA).
- Tactical execution of messaging and product content to meet aggressive delivery timelines.
- Worked on extracting data from different sources such as IBM MQ, Oracle, MS Access, and flat files.
- Created and maintained Teradata tables, views, macros, triggers, and stored procedures.
- Coded using Teradata analytical functions and wrote UNIX scripts to validate, format, and execute the SQL in the UNIX environment; an example follows this list.
- Used DataStage Designer to develop various ETL jobs to extract, cleanse, transform, integrate, and load data.
- Developed Informatica B2B scripts to handle unstructured data from flat files and relational sources such as Oracle, SQL Server, and Excel.
- Created mappings in Informatica following the third normal form (3NF) design model.
- Configured and used B2B Data Exchange for end-to-end data visibility through event monitoring and to provide universal data transformation supporting numerous formats, documents, and filters.
- Led SOA-based integration, scaling it to support 4 million users.
- Extensively dealt with performance issues and made the coding changes necessary to improve system performance.
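An example of the Teradata analytical-function style referenced above: keeping only the most recent row per account with ROW_NUMBER and Teradata's QUALIFY clause (table and column names are hypothetical):

```sql
SELECT account_id,
       txn_date,
       txn_amt
FROM   account_txn
QUALIFY ROW_NUMBER() OVER (PARTITION BY account_id
                           ORDER BY txn_date DESC) = 1;
```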
Environment: Informatica PowerCenter 8.6, Teradata, Informatica Power Exchange 8.1, Oracle 10g, PL/SQL, flat files, Agile Methodology, SQL Server 2007, Erwin, SQL Workbench, Toad 9.1, UNIX.
Confidential
Software/SQL Programmer
Responsibilities:
- Used Informatica as an ETL tool to extract data from multiple source systems by creating mappings using various transformations.
- Developed and tested mappings, sessions, worklets, and workflows.
- Developed and modified mappings according to business logic and tuned them for better performance.
- Created and ran workflows using the Workflow Manager to load data into the target database.
- Configured the sessions to handle updates while preserving existing records; see the sketch after this list.
- Created unit test scripts and checklists for Informatica objects (mappings, sessions, worklets, and workflows) as part of unit testing.
- Pulled data from mainframe systems and imported copybooks into the Informatica staging area.
- Created linked servers between different SQL Servers, and created linked servers to Access files used across various departments.
- Performed performance tuning of SQL queries and stored procedures using SQL Profiler and the Index Tuning Wizard.
- Extensively involved in data extraction from mainframe (VSAM) and Oracle sources using Power Exchange when the source data sets changed.
- Extensively used Normalizer (for COBOL source files from the S/390 mainframe system), Router, Lookup, Aggregator, Expression, and Update Strategy transformations.
- Developed SQL scripts to insert, update, and delete data in MS SQL database tables.
- Scheduled sessions and batches using the Workflow Manager to load data into target tables.
- Responsible for developing robust, reliable, low-maintenance solutions and testing them thoroughly.
- Developed Slowly Changing Dimension (SCD) Type 2 logic for loading data into dimensions and facts.
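A minimal sketch of the update-else-insert behavior the sessions above were configured for, expressed as a SQL MERGE (the policy tables and columns are hypothetical): matching target rows are updated in place, new rows are inserted, and untouched records are preserved.

```sql
MERGE INTO policy_tgt t
USING policy_stg s
   ON (t.policy_id = s.policy_id)
WHEN MATCHED THEN
    UPDATE SET t.premium = s.premium,
               t.status  = s.status,
               t.load_dt = SYSDATE
WHEN NOT MATCHED THEN
    INSERT (policy_id, premium, status, load_dt)
    VALUES (s.policy_id, s.premium, s.status, SYSDATE);
```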
Environment: Informatica PowerCenter 8.5, Oracle 9i, DB2, Kbase, UNIX Shell Scripting, Mainframes, MS PowerPoint, SQL, Erwin, IBM DataStage, PL/SQL, Windows NT 4.0