Sr. ETL Developer Resume
Peoria, IL
SUMMARY:
- 8+ years of IT experience in developing, testing, and validating Data Warehouse applications using ETL and Business Intelligence tools such as Informatica PowerCenter, PowerExchange, HP Quality Center, and ALM.
- Experience in installation, configuration, and administration of the Informatica PowerCenter 6.x/7.x/8.x/9.x client and server on Windows and UNIX.
- Full Software Development Life Cycle (SDLC) experience, involved in requirement analysis, design, development, testing, and maintenance, with working experience in Agile, Scrum, and Waterfall environments.
- Highly experienced in Extraction/Transformation/Loading of the legacy data to Data warehouse using ETL Tools.
- Strong experience in data analysis, data migration, data cleansing, transformation, data integration, and data import/export using multiple ETL tools such as Informatica PowerCenter and DataStage.
- Experience working with SDLC, RUP, Waterfall and Agile methodologies
- Involved in test planning and effort estimation. Responsible for test status reporting and documentation; single point of responsibility for releases and defect fixes.
- Excellent understanding of ETL, Dimensional Data Modeling techniques, Slowly Changing Dimensions (SCD) and Data Warehouse Concepts - Star and Snowflake schemas, Fact and Dimension tables, Surrogate keys, and Normalization/Denormalization.
- Designing and executing unit test plans and gap analysis to ensure that business requirements and functional specifications are tested and fulfilled.
- Used various transformations such as Expression, Filter, Aggregator, Lookup, Router, Normalizer, and Sequence Generator to load consistent data into Oracle and Teradata databases.
- Good knowledge of Normalization (1NF, 2NF, and 3NF) and De-normalization techniques for optimum performance in XML, relational, and dimensional database environments.
- Expertise in SQL/PLSQL programming, developing & executing Packages, Stored Procedures, Functions, Triggers, Table Partitioning, Materialized Views.
- Good experience with Teradata SQL Assistant; extracted data from Teradata and loaded it into various other target systems, and also used Teradata as a target.
- Extensive success in translating business requirements and user expectations into detailed specifications employing Unified Modeling Language (UML).
- Developed Dynamic packages using SSIS and Implemented Incremental Data Loading.
- Experience in performance tuning of sources, targets, mappings, transformations, and sessions; created CDC datamaps using PowerExchange and achieved incremental loading into target tables.
- Designed both 3NF data models for ODS/OLTP systems and dimensional data models using Star and Snowflake schemas.
- Performed Detailed Data Analysis (DDA), Data Quality Analysis (DQA) and Data Profiling on source data.
- Worked with PowerExchange sources and extracted non-relational data into Informatica via datamaps.
- Experience with Type 1, Type 2, and Type 3 dimensions.
- Experience with Teradata as the target for the data marts. Worked with BTEQ, FastLoad and MultiLoad.
- Experience in Integration of various data sources like Oracle, Teradata, SQL Server, DB2 and Flat Files in various formats like fixed width, CSV and excel.
- Expertise in error handling and reprocessing of error records during extraction and loading of data into enterprise warehouse objects.
- Hands-on experience creating, modifying, and implementing UNIX shell scripts for running Informatica workflows and for pre-processing and post-processing validations.
- Good knowledge of scheduling jobs in Autosys/Control-M and defining clear interdependencies between jobs.
- Excellent interpersonal and communication skills; experienced in working with senior-level managers, business users, and developers across multiple disciplines.
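As an illustration of the SCD Type 2 handling referenced above, a minimal Python sketch of the end-date-and-insert pattern (the table layout and column names here are hypothetical; the production work used Informatica mappings):

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply a Type 2 slowly changing dimension update.

    dimension: list of dicts with keys customer_id, name, city,
               eff_date, end_date, current_flag (hypothetical layout)
    incoming:  list of dicts with keys customer_id, name, city
    Rows whose tracked attributes changed are end-dated and a new
    current row is inserted; unchanged rows are left alone.
    """
    today = today or date.today().isoformat()
    by_key = {r["customer_id"]: r for r in dimension if r["current_flag"]}
    for rec in incoming:
        cur = by_key.get(rec["customer_id"])
        if cur is None:
            # brand-new key: insert as the current version
            dimension.append({**rec, "eff_date": today,
                              "end_date": None, "current_flag": True})
        elif (cur["name"], cur["city"]) != (rec["name"], rec["city"]):
            # changed attributes: close the old version, open a new one
            cur["end_date"] = today
            cur["current_flag"] = False
            dimension.append({**rec, "eff_date": today,
                              "end_date": None, "current_flag": True})
    return dimension
```

The same close-and-insert logic is what an Update Strategy transformation drives in a PowerCenter SCD Type 2 mapping.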
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 6.x/7.x/8.x/9.x, PowerExchange 8.6.1, DataStage 8.1/8.5
Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling using the Erwin tool
Testing Tools: HP Quality Center, Silk Test, SCM tools etc.
RDBMS: Oracle 11g/10g/9i/8.x, SQL Server 2008/2005/2000, MySQL, DB2, MS Access
Database Tools: TOAD, SQL*PLUS, SQL Developer, SQL*Loader, Teradata SQL Assistant
Languages: SQL (2012 and 2014), PL/SQL, Shell Scripting, C, C++, Perl, Python
Operating Systems: MS-DOS, Windows 7/Vista/XP/2003/2000/NT, UNIX, AIX, Sun Solaris
PROFESSIONAL EXPERIENCE:
Confidential, Peoria, IL
Sr. ETL Developer
Responsibilities:
- Identified the sources and analyzed the source data.
- Created and stored metadata in the repository using Informatica Repository Manager.
- Cleansed the source data, extracted and transformed data with business rules, and built reusable mappings using Informatica PowerCenter Designer.
- Involved in creating, editing, scheduling, and deleting sessions using Workflow Manager in Informatica.
- Worked with Cognos Developers during requirements gathering
- Implemented various Data Transformations using slowly changing dimensions
- Monitored workflows and collected performance data to maximize the session performance
- Provided fixes for critical bugs and assisted code moves to QA.
- Extensively worked with SQL queries. Created Stored Procedures, packages, Triggers, Views using PL/SQL Programming.
- Handled run time errors in SSIS packages utilizing Event Handlers and Row redirects.
- Used Teradata utilities FastLoad, MultiLoad, and TPump to load the data.
- Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant, and BTEQ.
- Created Logical/Physical Data models in 3NF in the Warehouse area of Enterprise Data Warehouse.
- Optimized the performance of the Informatica mappings. Configured session properties and target options for better performance.
- Utilized XML and SQL Server table configuration for the management and migration of SSIS packages in staging and pre-production environments.
- Performed Unit testing and Integration testing on the mappings. Worked with QA Team to resolve QA issues
- Created SSIS packages to load data into SQL Server using various transformations in SSIS.
- Implemented Error Logging using Event Handlers and implemented logging in SSIS Packages.
- Created the system to extract, transform, and load market data and checked correctness of data loading, using UNIX Korn shell, Perl, and Oracle stored procedures.
- Deployed packages from test environment to production environment by maintaining multiple package configurations in SSIS utilizing package and project deployment models in SSIS 2012.
- Validated complex mappings involving Filter, Router, Expression, Lookup, Update Strategy, Sequence generator, Joiner and Aggregator transformations.
- Expertise in defining and documenting business process flows (UML) such as use case diagrams, activity diagrams, sequence diagrams, and data flow diagrams.
- Used the Debugger to test mappings and fix bugs; created Debugger sessions with breakpoints for better analysis of mappings.
- Created mapping variables and parameters and used them appropriately in mappings.
- Extensively used all the features, including Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Used Informatica to load data to and extract data from Teradata efficiently via various connection types.
- Expertise in Event Handlers, Package Configurations, Logging, Checkpoints, Package Security, and User-Defined Variables for SSIS packages.
- Experience in enhancing and deploying the SSIS Packages from development server to production server.
- Designed and developed several ETL scripts using Informatica and UNIX shell scripts.
- Developed banking management scripts in Python to support the Chase website, creating user profiles and transactions for withdrawals and deposits.
- Did QA of ETL processes, migrated Informatica objects from development to QA and production using deployment groups.
- Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.
- Designed and Developed pre-session, post-session routines for Informatica sessions to drop and recreate indexes and key constraints for Bulk Loading.
- Involved in Performance Tuning at various levels including Target, Source, Mapping and Session for large data files.
- Involved in conducting and leading the team meetings and providing status report to project manager.
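The run-time error handling described in this role (redirecting bad rows and analyzing error tables) follows a row-redirect pattern, sketched below in Python; the validation rule names are illustrative only, as the actual work used SSIS event handlers and Informatica bad files/error tables:

```python
def load_with_error_redirect(rows, target, error_table, validators):
    """Route rows failing any validation rule to an error table with a
    reason code, loading only clean rows into the target.

    validators: dict mapping a reason code to a predicate over a row;
    the first failing rule determines the recorded reason.
    """
    for row in rows:
        reason = next((name for name, check in validators.items()
                       if not check(row)), None)
        if reason is None:
            target.append(row)
        else:
            # redirect the bad row with its reason for later reprocessing
            error_table.append({"row": row, "reason": reason})
    return len(target), len(error_table)
```

Rows captured in the error table can be corrected and fed back through the same routine, which is the reprocessing loop the bullet points describe.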
Environment: Informatica PowerCenter 9.6.1, IDQ 9.6.1, Oracle 11g, MySQL, DB2, UML, MS SQL Server 2005, Erwin, TOAD, Korn shell scripts, Perl, Python, Autosys, Tidal Scheduler, Shell Scripting, JIRA, Cognos 10.x.
Confidential, Rochester, MN
ETL Developer
Responsibilities:
- Interacted with Business Analysts to understand the business requirements and was involved in analyzing requirements to refine transformations.
- Provided technical guidance for re-engineering functions of Oracle warehouse operations.
- Collaborated with architects to align the ETL design to the business case and the overall solution.
- Developed UNIX shell scripts for master workflows.
- Designed and developed new and enhanced functionality for existing applications.
- Modified scripts to handle automated loading, extraction, and transformation (ETL) of data using SSIS.
- Created the ETL source to target mapping documents working with the business Analysts.
- Designed, developed, and maintained Enterprise Data Architecture for enterprise data management including business intelligence systems, data governance, data quality, enterprise metadata tools, data modeling, data integration, operational data stores, data marts, data warehouses, and data standards.
- Performed source data analysis and captured metadata, reviewed results with business. Corrected data anomalies as per business recommendation.
- Worked with Informatica PowerCenter 9.5 to extract data from IBM mainframe DB2 sources into Teradata.
- Created design standards of the ETL code by applying SCD logics.
- Created Parameter files, mapplets, and worklets for reusability in the code.
- Reviewed and maintained the ETL coding standards.
- Maintained an understanding of XML, XSD, DOM/SAX parsing, XPath and XSLT
- Worked with the persistent cache option wherever required to reuse the cache for larger table lookups.
- Tuned the performance of SSIS packages by avoiding blocking transformations, configuring DefaultBufferSize and DefaultBufferMaxRows, and using columnstore indexes wherever necessary.
- Created 3NF business-area data models with de-normalized physical implementations, and performed data and information requirements analysis using the Erwin tool.
- Designed the ETL processes using Informatica to load data from Mainframe DB2, Oracle, SQL Server, Flat Files, XML Files and Excel files to target Teradata warehouse database.
- Designed ETL packages dealing with different data sources (SQL Server, Flat Files) and loaded the data into target data sources by performing different kinds of transformations using SSIS.
- Extensively used transformations like router, lookup (connected and unconnected), update strategy, source qualifier, joiner, expression, stored procedures, aggregator and sequence generator transformation.
- Created Oracle stored procedures for capturing the ETL run statistics for the daily delta loads.
- Performance tuned the database stored procedures and changed the updates to deletes and inserts to reduce the costly operations on the database.
- Used performance tuning at the session level, mapping level, and at the database level.
- Used the ‘Organize On’ Option in the Netezza tables for frequently joined tables.
- Created DB objects using best practices to avoid data skew on the objects.
- Performance tuning of SQL scripts.
- Created and executed the unit test plans based on system and validation requirements.
- Worked on the migration of code across DEV, QA, UAT, and PROD using XML migrations.
- Effectively communicated project expectations to team members in a timely and clear fashion.
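The run-statistics capture for the daily delta loads described above (implemented in Oracle stored procedures) follows a simple wrap-and-record pattern; this Python sketch is illustrative, and the job and column names are hypothetical:

```python
from datetime import datetime, timezone

def run_with_stats(job_name, extract, transform, load, stats_log):
    """Run an ETL job and append one run-statistics record capturing
    start/end timestamps and row counts, in the spirit of the stored
    procedures that logged each delta load."""
    started = datetime.now(timezone.utc)
    rows_in = extract()                       # pull the delta set
    rows_out = [transform(r) for r in rows_in]  # apply the business rules
    load(rows_out)                            # write to the target
    stats_log.append({
        "job": job_name,
        "started": started.isoformat(),
        "finished": datetime.now(timezone.utc).isoformat(),
        "rows_read": len(rows_in),
        "rows_loaded": len(rows_out),
    })
    return stats_log[-1]
```

Keeping such a record per run is what makes reconciling daily deltas and spotting short loads straightforward.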
Environment: Informatica 9.1/9.6.1, Teradata, DB2, Oracle 11g, MySQL, Perl, Python, MS SQL Server, UML, Cognos, Erwin 9.5.02, Teradata SQL Assistant, Toad 12.1.0, Aginity Workbench 4.3, flat files, XML files.
Confidential, Detroit, MI
ETL Developer
Responsibilities:
- The business design work involved in establishing the reporting layouts for various reports and the frequency of report generation.
- Identified the information needs within and across functional areas of the organization.
- Modeled the process in an enterprise-wide scenario.
- Field mapping work involved establishing relationships between the databases Tables, filter criteria, formulas etc., needed for the reports.
- Managed database optimization and table-space fragmentation.
- Involved in full Software Development Lifecycle (SDLC)
- Responsible for developing, implementing, and testing data migration strategy for overall project in database using SQL 2012 as platform with global resources.
- Developed database objects including tables, Indexes, views, sequences, packages, triggers and procedures to troubleshoot any database problems
- Worked on Informatica PowerCenter 8.6 tools - Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Developer.
- Developed Informatica mappings and tuning of mappings for better performance.
- Extracted data from different flat files, MS Excel, MS Access and transformed the data based on user requirement using Informatica Power Center and loaded data into target, by scheduling the sessions.
- Created different source definitions to extract data from flat files and relational tables for Data mart.
- Used dynamic SQL to perform pre- and post-session tasks required while performing extraction, transformation, and loading.
- Tuned the performance of queries by working extensively with indexes.
- Created reusable mapplets and transformations, started concurrent batch processes on the server, and performed backup, recovery, and tuning of sessions.
- Created, modified, deployed, optimized, and maintained Business Objects Universes using Designer.
- Designed the ETL process using Informatica to populate the Data Mart from flat files into the Oracle database.
- Created complex mappings to populate the data in the target with the required information.
- Created workflows and sessions to perform the required transformations.
Confidential
ETL Developer
Responsibilities:
- Worked with Business Analyst in requirements gathering, business analysis and project coordination.
- Responsible for developing complex Informatica mappings using different transformations.
- Responsible for creating workflows and sessions using Informatica Workflow Manager and monitoring workflow runs and statistics in Informatica Workflow Monitor.
- Responsible for ETL Operation which extracts and transforms the data to Data Warehouse using SQL Server Integration Services (SSIS)
- Created and monitored sessions and various other tasks such as Event-Raise, Event-Wait, Decision, Email, Assignment, and Command tasks using Informatica Workflow Manager.
- Responsible for Defining Mapping parameters and variables and Session parameters according to the requirements and usage of workflow variables for triggering emails.
- Implemented Error Logging using Event Handlers and implemented logging in SSIS Packages.
- Responsible for tuning the Informatica mappings to increase the performance.
- Implemented complex ETL logic using SQL overrides in the source Qualifier.
- Performed unit test development and validated results with Business Analysts.
- Developed Unix Scripts for updating the control table parameters based on the environments.
- Responsible for providing written status reports to management regarding project status, tasks, issues/risks, and testing.
- Analyzed requirements to create test cases and obtained client approval for execution.
- Used defect tracking tools such as ATLAS for proper management and reporting of identified defects.
- Wrote various SQL queries for validating test data and production data from source systems before loading, for performance testing/UAT.
- Created various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process.
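The environment-driven control-table parameter updates mentioned above amount to merging per-environment overrides onto defaults; a small Python sketch of that pattern (the parameter names are hypothetical, and the actual scripts were UNIX shell):

```python
def resolve_control_params(environment, defaults, overrides):
    """Merge environment-specific overrides onto default control-table
    parameters, mirroring the environment-aware UNIX scripts above."""
    params = dict(defaults)                       # start from the defaults
    params.update(overrides.get(environment, {})) # apply env overrides, if any
    params["ENV"] = environment                   # stamp the target environment
    return params
```

An environment with no override entry simply inherits the defaults, which keeps DEV, QA, UAT, and PROD behavior consistent from one control table.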
Environment: Informatica 9.6.1, Oracle 10g, SQL, PL/SQL, MySQL, Teradata, TOAD, Shell Scripts, UNIX (AIX), Autosys, XSLT, MQ Migration Tools, Defect Tracking Tools (ATLAS).