ETL Developer Resume
GA
SUMMARY
- 8+ years of IT experience in analysis, design, development, implementation and troubleshooting of Data Mart/Data Warehouse applications using Teradata and ETL tools such as Informatica Power Center 9.5/9.1/8.6/7.1/6.1 and Power Exchange 8.6.
- Strong experience in Data Warehousing (ETL & OLAP) environments; excellent analytical, coordination, and interpersonal skills with strong leadership potential.
- Strong knowledge of Dimensional Modeling, Star and Snowflake schema. Designed Fact and Dimension Tables as per the reporting requirements and ease of future enhancements.
- Expertise in Data Flow Diagrams, Process Models and ER diagrams with modeling tools like Erwin r7/4/3.5 and MS Visio 2007/2010.
- Adept at understanding Agile software development methodologies and frameworks.
- Extensive work experience in ETL processes consisting of data sourcing, data transformation, mapping and loading of data from multiple source systems into Data Warehouse using Informatica Power Center 9.5/9.1/8.6 and 7.1.
- Good understanding of relational database management systems; experience in integrating data from various data sources such as Oracle 11g/10g/9i/8i, MS SQL Server 2005/2008/2012, Teradata 13/12/V2R5, flat files and XML into the staging area.
- Experience in implementing Slowly Changing Dimension (SCD) methodology for accessing the full history of accounts and transaction information.
- Experience in implementing complex business rules by creating reusable transformations (Expression, Aggregator, Filter, Connected and Unconnected Lookup, Router, Rank, Joiner, Update Strategy) and developing complex Mapplets, Mappings, and Triggers.
- Extensive experience in developing the Workflows, Worklets, Sessions, Mappings, and configuring the Informatica Server using Informatica Power Center.
- Experienced with Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad, TPump, Teradata Administrator, SQL Assistant, PMON, and Visual Explain); coded complex scripts for the Teradata utilities and fine-tuned queries to enhance performance.
- Experienced in creating Transformations and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
- Expertise in writing SQL queries and developing views and stored procedures in Teradata.
- Experience in debugging, error handling and performance tuning of sources, targets, mappings and sessions with the help of error logs generated by Informatica server
- Experience in data integration, cleansing and massaging.
- Extensively used the Repository Manager, Designer, Workflow Manager, and Workflow Monitor client tools of Informatica.
- Experience in UNIX shell and batch scripting, Perl, FTP and file management in various UNIX/Windows environments.
- Performed unit testing at various levels of the ETL.
- Experience in production support, resolving critical issues and collaborating with teams to ensure successful resolution of reported incidents.
- Proficiency in writing technical documentation to describe program development, logic, coding, testing, changes and corrections.
- Excellent interpersonal skills and good experience in interacting with clients; a strong team player with solid problem-solving skills.
TECHNICAL SKILLS
Languages: ANSI SQL, PL/SQL, ASP.NET, VBScript, XML, HTML, DHTML, HTTP, Shell Scripting
RDBMS: Teradata 12/13, SQL Server 2012/2008 R2/2008/2005/2000, Oracle 11g/10g/9i/8i, DB2
Designing Tools: MS Visio 2007/2010, Erwin r7/4.5/4.0/3.5
Tools and Utilities: Informatica 9.6, Teradata Load Utilities (MultiLoad, FastLoad, TPump, TPT)
Operating Systems: Unix, Windows NT/XP/2003/2000, Windows 9x
PROFESSIONAL EXPERIENCE
Confidential, GA
ETL Developer
Responsibilities:
- Worked closely with the Enterprise Data Warehouse team and Business Intelligence Architecture team to understand repository objects that support the business requirement and process.
- Communicated with business users and analysts on business requirements. Gathered and documented technical and business Meta data about the data.
- Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
- Proficient in importing/exporting large amounts of data from files to Teradata and vice versa.
- Developed the DW ETL scripts using BTEQ, Stored Procedures, Macros in Teradata
- Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FASTEXPORT, MULTILOAD, FASTLOAD and Informatica.
- Development of scripts for loading the data into the base tables in EDW using FastLoad, MultiLoad and BTEQ utilities of Teradata
- Created numerous scripts with Teradata utilities BTEQ, MLOAD and FLOAD.
- Developed mappings to load data from Source systems like Oracle to Data Warehouse
- Loaded staging tables on Teradata and then loaded target tables on Teradata via views.
- Developed reports using the Teradata advanced analytical functions like rank, row number and window functions
- Worked on data verifications and validations to ensure that the data generated according to the requirements is appropriate and consistent.
- Used various Teradata Index techniques to improve the query performance
- Created series of Macros for various applications in Teradata SQL Assistant
- Worked on Set, Multiset, Derived, Volatile, and Global Temporary tables.
- Experience in loading data into staging tables via views.
- Designed and developed weekly, monthly reports related to the marketing and financial departments using Teradata SQL.
- Performed Tuning and Optimization for increasing the efficiency of the scripts
- Designing logical and physical database model using Erwin from high level functional, business, reporting and data mining needs.
- Physical database design in Teradata including views, indexes, Join Indexes, Aggregate Join Indexes for query optimization and building cubes on facts and dimension tables.
- Creating Semantic, logical and physical database models, creating Metadata, views, indexes, Join Indexes, Aggregate Join Indexes and related database structures using Erwin.
- Design of ETL strategies /solutions and specifications for extracting logic from different sources using Informatica 9.6 and Teradata v13 load utilities.
- Analyzed the source data using Informatica IDQ and Teradata Warehouse Manager for data profiling, to check data quality and identify cleansing and standardization needs.
- Created FastLoad, FastExport, MultiLoad, TPump, and BTEQ scripts to load data from the Oracle database and flat files into the primary data warehouse.
- Implemented transformations for data conversions into required form based on the client requirement using Teradata ETL processes.
- Performed job scheduling to ensure execution of certain jobs and daily updates.
- Performed several POCs to explore various architectural solutions.
- Resolved and closed production tickets generated by failures of daily incremental production jobs and provided on-call support for production issues.
Environment: Teradata V13/14, Teradata SQL Assistant, Oracle 11g, Teradata Viewpoint, Informatica 9.6, Erwin, Hadoop ecosystem, Linux.
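The ranking and windowing reports mentioned above can be sketched with a short Teradata-style query. This is a minimal illustration only: the `sales_mart.weekly_sales` table and its columns are hypothetical stand-ins, not an actual client schema.

```sql
-- Hypothetical weekly report: top 10 accounts per region by revenue,
-- with a per-region running sequence and region total.
SELECT
    region_id,
    account_id,
    sales_week,
    weekly_revenue,
    RANK()       OVER (PARTITION BY region_id ORDER BY weekly_revenue DESC) AS revenue_rank,
    ROW_NUMBER() OVER (PARTITION BY region_id ORDER BY sales_week)          AS week_seq,
    SUM(weekly_revenue) OVER (PARTITION BY region_id)                       AS region_total
FROM sales_mart.weekly_sales
QUALIFY RANK() OVER (PARTITION BY region_id ORDER BY weekly_revenue DESC) <= 10;
```

Teradata's QUALIFY clause filters on the window-function result directly, which avoids wrapping the query in a derived table just to apply the rank filter.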
Confidential, Lewisville, TX
ETL Developer
Responsibilities:
- Interacted with the Business users to identify the process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.
- Developed the FRD (Functional Requirements Document) and data architecture document and communicated them to the concerned stakeholders. Conducted impact and feasibility analysis.
- Worked on dimensional modeling to design and develop STAR schemas by identifying the facts and dimensions. Designed logical models as per business requirements using Erwin.
- Created a TPT (Teradata Parallel Transporter) script to load data from files into one of the major tables in the data warehouse.
- Created users, groups and gave read/write permissions on the respective Folders of repository.
- Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Application Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
- Used mapping parameters and variables.
- Used parallel processing capabilities, Session-Partitioning and Target Table partitioning utilities.
- Performed the data validations and control checks to ensure the data integrity and consistency.
- Created deployment groups in one environment for the Workflows, Worklets, Sessions, Mappings, Source definitions, Target definitions and imported them to other environments.
- Upgraded Informatica Power Center from version 8.6 to 9.1.
- Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
- Involved in migration of ETL code from development repository to testing repository and then to production repository.
- Worked on Teradata Global temporary and volatile tables.
- Developed processes on Teradata using shell scripting and Teradata utilities such as MultiLoad, FastLoad, FastExport, TPump, TPT, and BTEQ.
- Worked closely with Teradata administration team to create secondary indexes required for performance tuning of Data mart loads and reports.
- Created Teradata external loader connections such as MLoad Upsert, MLoad Update, and FastLoad in the Informatica Workflow Manager while loading data into target tables in the Teradata database.
- Used the SET EXPLAIN utility in Informix to collect detailed SQL query plans and execution statistics.
- Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
- Automated UNIX shell scripts to verify the count of records added every day by the incremental data load for several base tables, in order to check for consistency.
- Worked on SQL optimization. Identified bottlenecks and performance tuned the Informatica mappings/sessions.
- Prepared Run book for the daily batch loads giving the job dependencies and how to restart a job when it fails for ease of handling job failures during loads.
- Reviewed the Testing progress and issues to be resolved by conducting walkthroughs.
- Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
- Responsible for regression testing ETL jobs before test to production migration.
- Functioned as the primary liaison between the business line, operations, and the technical areas throughout the project cycle.
Environment: Informatica Power Center 9.1/8.6, Teradata 13, WinSCP, HP QC, Oracle 10g, DB2, Erwin r7, Flat Files, Teradata SQL Assistant, Autosys, TOAD, SQL, PL/SQL, Windows XP, UNIX.
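The volatile-table staging pattern described above can be sketched as follows. This is a minimal, hypothetical example: the database, view, table, and column names are assumed for illustration, and the one-day incremental window is an assumption.

```sql
-- Hypothetical intermediate staging step using a Teradata volatile table,
-- sourcing from a view over the incoming staging data.
CREATE VOLATILE TABLE vt_stg_orders AS (
    SELECT order_id, customer_id, order_amt, order_dt
    FROM   stg_db.v_orders_incoming          -- loading via a view, as above
    WHERE  order_dt >= CURRENT_DATE - 1      -- assumed daily incremental window
) WITH DATA
PRIMARY INDEX (order_id)
ON COMMIT PRESERVE ROWS;                     -- keep rows for the session

-- Push the prepared rows into the target table.
INSERT INTO edw_db.orders_fact
SELECT order_id, customer_id, order_amt, order_dt
FROM   vt_stg_orders;
```

Volatile tables live only for the session and need no DDL cleanup, which makes them a convenient scratch area between the staging views and the target load.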
Confidential, Chicago, IL
Informatica Developer
Responsibilities:
- Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
- Designed ETL specification documents for all the projects.
- Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.
- Extracted data from flat files, SQL Server and Oracle to build an Operational Data Store. Applied business logic to load the data into the Global Data Warehouse.
- Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
- Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
- Used Informatica Designer to create Load, Update and CDC mappings using different transformations to move data to different data marts in Data Warehouse.
- Used Informatica file watch events to poll the FTP sites for the external mainframe files.
- Used the Add Currently Processed Flat File Name port to load the flat file name, and loaded the contract number derived from the flat file name into the target.
- Worked with session logs, Informatica Debugger and Performance logs for error handling when session fails.
- Created Sessions and Batches to run Workflows.
- Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
- Developed database triggers to enforce complicated business logic and integrity constraints, and to enhance data security at the database level.
- Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
- Worked on different tasks in Workflow Manager like Sessions, Event-Raise, Event-Wait, Decision, E-mail, Command, Worklets, Assignment, Timer, and scheduling of the workflow.
- Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
- Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
- Developed Batch Jobs using UNIX Shell scripts to automate the process of loading, pushing and pulling data from and to different servers.
- Used Control-M to schedule ETL Jobs.
- Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.
- Performed unit testing at various levels of the ETL and actively participated in team code reviews.
- Created detailed Unit Test Document with all possible Test cases/Scripts.
- Conducted code reviews of code developed by teammates before moving it into QA.
- Provided support to develop the entire warehouse architecture and plan the ETL process.
- Modified existing mappings for enhancements of new business requirements.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
- Written documentation to describe program development, logic, coding, testing, changes and corrections.
- Implemented Informatica recommendations, methodologies and best practices.
Environment: Informatica Power Center 8.1, DB2, Power Exchange 8.1, Erwin r3.5, SQL, PL/SQL, TOAD, Flat files, Oracle 10g, Unix, Maestro
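The database-trigger approach described above can be illustrated with a small Oracle-style PL/SQL trigger. The `orders` table, its columns, and the error code are hypothetical; the sketch only shows the shape of enforcing a rule and stamping changes at the database level.

```sql
-- Hypothetical trigger enforcing a business rule and auditing changes.
CREATE OR REPLACE TRIGGER trg_orders_check
BEFORE INSERT OR UPDATE ON orders
FOR EACH ROW
BEGIN
    -- Reject negative amounts before they reach the warehouse feed.
    IF :NEW.order_amt < 0 THEN
        RAISE_APPLICATION_ERROR(-20001, 'order_amt must be non-negative');
    END IF;
    -- Stamp the change so downstream incremental loads can pick it up.
    :NEW.last_modified := SYSDATE;
END;
/
```

Enforcing the rule in the trigger keeps bad rows out of the warehouse regardless of which application or load path writes the table.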
Confidential, Scranton, PA
Informatica Developer
Responsibilities:
- Involved in the Data Warehouse Data modeling based on the client requirement.
- Analyzed various sources, requirements and existing OLTP systems and identified required dimensions and measures from the database.
- Used Informatica Designer to create complex mappings using different transformations to move and update data and store it into a Data Warehouse according to the specifications.
- Designed and Developed several complex mappings by using various Transformations such as Lookup, Update Strategy, Router, Filter, Sequence Generator, Source Qualifier, joiner and more on the data extracted according to the business needs.
- Created Reusable transformations and Mapplets using Transformation Developer and Mapplet Designer.
- Implemented SCD methodology including Type 1, Type 2 changes to handle data loads.
- Used debugger to debug mappings to gain troubleshooting information about data and error conditions.
- Created various Sessions, Workflows and Control Tasks to migrate data to target data warehouse tables using Workflow Manager.
- Managed sessions and Scheduled Jobs on daily or one time basis using Workflow Manager.
- Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks.
- Extensively used Debugger Process to modify data and applying Break Points while Session is running.
- Created and used SQL Queries for performing updates where Informatica is taking more time on huge dataset tables.
- Written UNIX Shell Scripts to automate the bulk load & update Processes.
- Prepared test scenarios and test cases and involved in unit testing of mappings and user acceptance testing and fixed the errors to meet the requirements.
- Provided production support involving monitoring and error handling for all the workflows involved in the application.
Environment: Informatica Power Center 7.1, DB2, Power Exchange 7.1, SQL Server, TOAD, SQL, PL/SQL, Flat files, Oracle 9i, Unix, Maestro
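A minimal SCD Type 2 load of the kind described above might look like the following. All table, column, and sequence names are illustrative (an Oracle-style sequence is assumed for the surrogate key), not taken from the actual project.

```sql
-- Hypothetical SCD Type 2 load: expire the current row, then insert the
-- new version for every changed customer arriving in the staging delta.
UPDATE dim_customer
SET    eff_end_dt  = CURRENT_DATE - 1,
       current_flg = 'N'
WHERE  customer_nk IN (SELECT customer_nk FROM stg_customer_delta)
AND    current_flg = 'Y';

INSERT INTO dim_customer
      (customer_sk, customer_nk, customer_name, eff_start_dt, eff_end_dt, current_flg)
SELECT seq_customer_sk.NEXTVAL,          -- assumed surrogate-key sequence
       s.customer_nk, s.customer_name,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer_delta s;
```

Keeping the expired row with its effective-date range is what preserves the full history of account changes, which Type 1 overwrites would lose.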
Confidential
Software Engineer
Responsibilities:
- Extensively worked on Informatica to extract data from Flat files, Excel files, and RDBMS tables and applied business logic to load the data into the target database.
- Designed and developed simple and complex mappings using various transformations like Aggregator, Expression, Filter, Joiner, Connected and Unconnected Lookup, Router, Source Qualifier, Sequence Generator, Update Strategy and more to implement logic on the data extracted.
- Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
- Created Tasks, Workflows and Sessions to move the data at specific intervals on demand using Workflow Manager.
- Provided technical support and worked on issues raised during production phase.
- Participated in weekly end user meetings to discuss data quality, performance issues.
Environment: Informatica Power Center 6.2, DB2, SQL Server, TOAD, SQL, PL/SQL, Flat files, Oracle 9i, Unix, Maestro
Confidential
Software Engineer
Responsibilities:
- Responsible for designing, testing and debugging new database applications and maintaining existing systems.
- Involved in the creation of database objects like Tables, Views, Stored Procedures, Functions, Packages, DB triggers, Indexes.
- Implemented integrity constraints on database tables.
- Responsible for performance tuning activities such as optimizing SQL queries, using explain plan, and creating indexes.
- Worked on database structure changes, table/index sizing, transaction monitoring, data conversion, and loading data into Oracle tables using SQL*Loader.
- Performed Unit testing in development environment and had close interactions with system test and user acceptance team to complete technical and functional testing.
Environment: Oracle 7.x, PL/SQL, SQL*Plus, SQL*Loader, Visual Basic 6.0, ODBC, Windows NT, UNIX, and Shell Scripts.
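The explain-plan-driven tuning described above can be sketched as follows. The `orders` table and index name are hypothetical; the plan is read directly from PLAN_TABLE, in the pre-DBMS_XPLAN style appropriate to older Oracle releases.

```sql
-- Hypothetical tuning pass: capture the plan for a slow query...
EXPLAIN PLAN FOR
SELECT order_id, order_amt
FROM   orders
WHERE  customer_id = 1001;

-- ...inspect the recorded plan rows (a full table scan would show up
-- as a TABLE ACCESS / FULL operation on ORDERS)...
SELECT id, operation, options, object_name
FROM   plan_table
ORDER  BY id;

-- ...then add a supporting index on the filtered column and re-explain.
CREATE INDEX idx_orders_cust ON orders (customer_id);
```

Re-running EXPLAIN PLAN after creating the index confirms whether the optimizer switched to an index-driven access path.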