
Sr. ETL Developer/Analyst Resume

McLean, VA

SUMMARY:

  • Data Analyst / Certified Informatica Developer with over 10 years in Information Technology and a wide range of experience in Software Development, Project Management, Data Integration, Master Data Management, and Quality Assurance, with a mortgage-industry background.
  • Experience in Business Analysis, Application Design, Development, and Implementation for the Mortgage, Health Care, Insurance, Commercial, and Financial Services industries.
  • Strong experience in the analysis, design, development, testing and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP, BI, Client/Server applications.
  • Experience in all phases of the IT project Software Development Life Cycle (SDLC): analysis, design, coding, testing, deployment, and production support.
  • Experience in programming, database design and development, and data warehousing using Informatica for Extraction, Transformation, and Loading (ETL) of data from multiple database sources for medium to large enterprise data warehouses.
  • Experience in data modeling and reverse-engineering data warehouses using tools like ER Studio.
  • Experience in Data Warehouse/Data Mart development life cycle with thorough knowledge in Star schema, Snowflake schema, Dimension and Fact tables.
  • Practical understanding of Data Modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, and Fact and Dimension tables.
  • Comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing, data manipulation.
  • Created and implemented ER models and dimensional models (snowflake schemas).
  • Expertise in data integration, data migration, and extracting data from various data sources like Mainframe, Oracle, Netezza, DB2, SQL Server, CSV files, Text files, XML files, and Excel files.
  • Experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), Informatica Power Exchange, and Power Connect as ETL tools on Oracle, DB2, Netezza, and SQL Server databases.
  • Wrote SQL scripts to test mappings and developed a Traceability Matrix mapping Business Requirements to Test Scripts so that any change control in requirements leads to test-case updates.
  • Performed extensive data validation by writing complex SQL queries, performed back-end testing, and resolved data quality issues.
  • Worked with end users to gain an understanding of information and core data concepts behind their business.
  • Assisted in defining business requirements for the IT team and created BRD and functional specifications documents along with mapping documents to assist the developers in their coding.
  • Identified and recorded defects with the information required for the development team to reproduce each issue.
  • Performed data analysis and data profiling using SQL on various source systems including Oracle, DB2, and SQL Server.
  • Hands on experience with Informatica Metadata Manager (IMM) to load metadata to get complete lineage.
  • Experience with basic Informatica administration tasks such as creating folders and users, granting permissions, optimizing Informatica server settings, and configuring and installing the Informatica repository on the server.
  • Used Informatica Data Quality (IDQ) tool for standardizing and data profiling on a single platform that provides a centralized set of reusable rules and tools for managing data quality across any project.
  • Developed data mappings for the Slowly Changing Dimensions - SCD1, SCD2 (Flagging, Time Stamping, and Versioning), and SCD3. Implemented complex business rules by creating reusable transformations and Mappings/Mapplets.
  • Experienced with Informatica Power Exchange for Loading/Retrieving data from mainframe systems.
  • Experience in Dimensional Modeling using Star and Snowflake Schemas, identifying Facts and Dimensions, and physical and logical data modeling using Erwin and ER-Studio; experience in debugging and performance tuning of targets, sources, mappings, and sessions.
  • Database experience using Stored Procedures, Functions, Triggers, Joins, Views, and Packages in PL/SQL and Oracle; expertise in working with relational databases such as Oracle 11g/10g/9i/8x, SQL Server 2008/2005, DB2 8.0/7.0, UDB, MS Access, and Netezza.
  • Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Good interpersonal and communication skills, and experienced in working with senior level managers, business people and developers across multiple disciplines.

TECHNICAL SKILLS:

Hardware/OS: Windows XP/2000/2003, Sun Solaris, UNIX, Linux, Apache Tomcat, IIS Server

ETL Tools: Informatica Power Center (9.x/8.x/7.x/6.x), Informatica Power Exchange, Informatica Data Quality (IDQ), Informatica Metadata Manager (IMM), Informatica Web Service Console, MDM

Databases: Oracle 9i/10g/11g, DB2, Sybase, ParAccel, Netezza, SQL Server 2005/2008

Reporting Tools: Crystal Reports, Cognos Reports, SSIS, SSRS, Business Objects, Toad 7.4, ERwin Data Modeler, Oracle Designer Professional, SQL*Plus, PL/SQL Developer, SQL Developer, Test Director, Attunity, Data Mirror, Autosys, Control-M, Oracle Designer 12.0, and ER Studio 8.5.3

Languages: SQL, PL/SQL, Shell Scripting

Methods: Performance Tuning, Logical Data Modeling, Dimensional Modeling, Business Intelligence Reporting, ERwin Relational Diagramming, Data Governance, Business and Data Architecture

PROFESSIONAL EXPERIENCE:

Sr ETL Developer/Analyst

Confidential, McLean, VA

  • Worked on an Agile SDLC team to develop a common formal SDLC based on iterative and incremental development.
  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment. Developed working documents to support findings and assign specific tasks
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Provided level-of-effort (LOE) estimates for all tasks, assigned them to the team, and worked closely with managers to keep the project on schedule.
  • Performed extensive data validation by writing complex SQL queries, performed back-end testing, and resolved data quality issues.
  • Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing, data manipulation.
  • Created entity relationship diagrams and multidimensional data models, reports and diagrams for marketing.
  • Used Model Mart of ERwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Worked with end users to gain an understanding of information and core data concepts behind their business.
  • Assisted in defining business requirements for the IT team and created BRD and functional specifications documents along with mapping documents to assist the developers in their coding.
  • Identified and recorded defects with the information required for the development team to reproduce each issue.
  • Worked closely with data modeling team in building data models, architect team, admins and TCM team in the development of current and target state.
  • Used Informatica Metadata Manager (IMM) to create source to target lineage by loading metadata from all different applications and scanned all applications.
  • Documented the complex process flow as per the enterprise architecture standards to describe program development, logic, testing and integration steps in the technical design documents and system interface document based on the architecture document.
  • Performed data analysis on the source systems required for the Loan application to meet business user needs.
  • Performed a proof of concept to demonstrate the use of Informatica Web Services and the Informatica Data Quality (IDQ) tool (data profiling, data cleansing, and standardization).
  • Performed gap analysis to check the complexity of existing system infrastructure with the new business rule.
  • Involved with data analysis/profiling for multiple sources and answered complex business questions by implementing roll forward mechanism.
  • Extensively used Informatica Power Center tools: Designer, Repository Manager, Workflow Manager, Workflow Monitor, and Web Services Hub.
  • Worked on Informatica partition and parallel processing techniques to improve the execution time and performance.
  • Worked on performance tuning of various mappings and sessions to identify and remove bottlenecks.
  • Worked on data integration projects and implemented performance tuning techniques to improve performance.
  • Used Type 1 SCD and Type 2 SCD mappings to update slowly Changing Dimension Tables.
  • Developed complex mappings to load data from multiple source systems like Oracle, flat files, DB2 files to data mart in ParAccel database and XML files to data mart in Oracle database.
  • Implemented ETL CDC (Change Data Capture) to load incremental changed data to data mart.
  • Experience exposing ETL jobs as web services so that other intranet applications can invoke them to process data and create XML files.
  • Experience calling Java web services and invoking JAR utilities from Power Center Designer to read data from web services.
  • Responsible for design and implementation needed for loading and updating the warehouse.
  • Worked with the Integrating team and raising request for the deployment of developed code in SIT and UAT environment and supporting SIT and UAT team to resolve issue.
  • Assisted the testing team in developing test scenarios, test logic, and test data to support unit and system integration testing, and executed test plans.
  • Worked on creating views in IBM Clear Case for the backup of all the developed mappings.
  • Wrote UNIX shell scripts to trigger ETL jobs, do file level validation and send email notifications to business.
  • Used the Informatica Data Quality (IDQ) tool to standardize addresses and ZIP codes using address validation transformations.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Used Informatica Power Exchange to connect to the mainframe for loading/retrieving data.
  • Created JIL scripts for Autosys jobs to run in the allotted window for a Data Mart.
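The shell-script work described above (triggering ETL jobs, file-level validation, and email notifications to the business) can be sketched roughly as follows. All paths, workflow names, and addresses here are hypothetical placeholders, not the actual project's names:

```shell
#!/bin/sh
# Sketch of a file-level validation wrapper that gates an ETL workflow start.
# SRC_FILE, the workflow name, and the mail address are hypothetical.

SRC_FILE="${1:-/tmp/etl_demo/loanapp_feed.csv}"

validate_file() {
    f="$1"
    # File must exist and be non-empty
    [ -s "$f" ] || { echo "FAIL: $f missing or empty"; return 1; }
    # Header row must contain the expected comma delimiter
    head -n 1 "$f" | grep -q ',' || { echo "FAIL: $f header not comma-delimited"; return 1; }
    echo "PASS: $f"
}

# Simulate an arriving source file for this self-contained sketch
mkdir -p /tmp/etl_demo
printf 'loan_id,amount,status\n1001,250000,ACTIVE\n' > "$SRC_FILE"

if validate_file "$SRC_FILE"; then
    # In the real job this would start the workflow and notify the business, e.g.:
    #   pmcmd startworkflow -sv INT_SVC -d DOMAIN -f LOAN_MART wf_load_loan_mart
    #   mailx -s "Load started" dl-business@example.com < /dev/null
    echo "TRIGGER: wf_load_loan_mart"
fi
```

The validation runs before any workflow is invoked, so a bad or empty feed fails fast with a clear message instead of loading partial data.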

Environment: Informatica Power Center 9.6, Informatica Data Quality (IDQ) 9.6, Informatica Power Exchange 9.6, Informatica Metadata Manager (IMM) 9.6, Informatica Web Services 9.6, Oracle 10g, SQL Server 2008, PL/SQL Procedures, ParAccel, Netezza, IBM Rational ClearCase, GIT, DB2, MS Visio, ALM, UNIX, PuTTY, AutoSys.

Sr. ETL Developer

Confidential, Rochester, NY

  • Involved in all phases of the SDLC from requirements, design, development, testing, training, and rollout to field users, as well as production support.
  • Converted functional specifications into technical design documents (TDD) and mapping documents.
  • Extensively used Informatica Power Center Designer client tools like Source Analyzer, Target Designer, Mapping Designer, and Mapplet Designer.
  • Created Mapplets with the help of Mapplet Designer and used those Mapplets in the Mappings. Created reusable transformations by using Lookup, Aggregator, Normalizer, Update strategy, Expression, Joiner, Rank, Router, Filter, and Sequence Generator etc. in the Transformation Developer and Mapplet Designer, respectively.
  • Used different Transformations like Address Validation Router, Filter, Lookup, Joiner, Aggregator, Normalizer, Rank, Update Strategy in Data Quality tool and exported to Power Center as a Mapplets and used in the mapping.
  • Created mappings in Informatica Data Quality (IDQ), defined rules, and imported the mappings into Power Center.
  • Used the Informatica Data Quality (IDQ) tool to standardize addresses and ZIP codes using address validation transformations.
  • Used Control-M to schedule jobs and used table-load scripts to load tables.
  • Responsible for design and implementation needed for loading and updating the warehouse.
  • Created scripts for better handling of incoming source files such as moving files from one directory to another directory and extracting information from file names, such as date, for continuously incoming sources.
  • Wrote UNIX shell scripts, Perl scripts, and pmcmd commands for FTP of files from remote servers and for backup of the repository and folders.
  • Experienced with Informatica Power Exchange for Loading/Retrieving data from mainframe systems.
  • Created deployment groups to deploy objects and used MKS to deploy SQL scripts.
  • Used Informatica Workflow Manager to create, schedule, and execute Sessions, Worklets, Command and E-Mail Tasks, and Workflows. Performed validation and loading of flat files received from business users.
  • Worked on PL/SQL database writing stored procedures, triggers, views, indexes to extract the data as per requirements.
  • Writing PL/SQL Procedures, Functions and Packages to meet the module requirements.
  • Fixed invalid mappings, tested stored procedures and functions, and performed unit testing of Informatica sessions and workflows.
  • Used Parameter files to reuse the mapping with different criteria to decrease the maintenance.
  • Extensively used Informatica Debugger for testing the mapping logic during Unit Testing and also worked with QA teams regarding fixing bugs.
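The incoming-file handling described above (moving files between directories and extracting information such as dates from file names for continuously arriving sources) can be sketched like this; the directory layout and file-name pattern are hypothetical:

```shell
#!/bin/sh
# Sketch of landing-zone housekeeping: pull the business date embedded in a
# source file name and archive the file into a dated directory.
# LANDING, ARCHIVE, and the claims_extract_* pattern are hypothetical.

LANDING=/tmp/etl_landing
ARCHIVE=/tmp/etl_archive
mkdir -p "$LANDING" "$ARCHIVE"

# Simulate a continuously arriving source file
touch "$LANDING/claims_extract_20240115.txt"

for f in "$LANDING"/claims_extract_*.txt; do
    base=$(basename "$f" .txt)
    file_date=${base##*_}        # strip through the last '_' -> 20240115
    mkdir -p "$ARCHIVE/$file_date"
    mv "$f" "$ARCHIVE/$file_date/"
    echo "archived $base for business date $file_date"
done
```

Keeping each day's files in their own dated archive directory makes reruns and reconciliation straightforward, since the raw input for any business date can be located without parsing logs.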

Environment: Informatica Power Center 9.x, Informatica Data Quality (IDQ), Informatica Power Exchange, Oracle 10g, SQL Server 2008, PL/SQL Procedures, TOAD, Netezza, Rational ClearCase, Control-M, MKS Integrity Client, Sybase, MS Visio, ALM, UNIX, PuTTY

Sr. ETL Developer

Confidential

  • Involved in all phases of the SDLC from requirements, design, development, testing, training, and rollout to field users, as well as production support.
  • Responsible for giving production support during deployment and implementing the changes into Production.
  • Involved in writing T-SQL procedures using SQL Server.
  • Imported Teradata objects into Informatica Power Center.
  • Heavily involved in developing and debugging Sybase and SQL Server procedures.
  • Gathered requirement and prepared mapping documents and the data flow documents.
  • Worked with power center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Used different Transformations like Address Validation Router, Filter, Lookup, Joiner, Aggregator, Normalizer, Rank, Update Strategy in IDQ and exported to Power Center as a Mapplets and used in the mapping.
  • Used IDQ for data cleansing and profiling.
  • Informatica Data Quality supports data quality improvement and standardization on a single platform that provides a centralized set of reusable rules and tools for managing data quality across any project.
  • Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
  • Experience analyzing user requirements and translating them into system data structure designs.
  • Solid Expertise in using both connected and unconnected Lookup Transformations.
  • Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
  • Extensively worked with Joiner functions like normal join, full outer join, master outer join and detail outer join in the Joiner transformation.
  • Worked with session logs and workflow logs for error handling and troubleshooting in the Dev environment.
  • Used the Debugger wizard to troubleshoot data and error conditions.
  • Developed reusable transformations and reusable Mapplets.
  • Worked extensively with session parameters, Mapping Parameters, Mapping Variables, and Parameter files for incremental loading.
  • Responsible for Unit Testing of Mappings and Workflows.
  • Updated UNIX shell scripts to trigger jobs, move files, DMC check, email notifications.
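The parameter-file approach to incremental loading mentioned above can be sketched as a wrapper that regenerates the parameter file from a watermark before each run; the folder, workflow, connection, and parameter names are hypothetical:

```shell
#!/bin/sh
# Sketch of generating an Informatica parameter file for incremental loading:
# the mapping filters source rows on $$LAST_EXTRACT_DATE so each run picks up
# only changes since the previous successful run. All names are hypothetical.

WATERMARK=/tmp/etl_watermark.txt
PARAM_FILE=/tmp/wf_incr_load.param

# Read the last successful extract date (seeded on the very first run)
[ -f "$WATERMARK" ] || echo "1900-01-01" > "$WATERMARK"
LAST_RUN=$(cat "$WATERMARK")

cat > "$PARAM_FILE" <<EOF
[CLAIMS_FOLDER.WF:wf_incr_load]
\$\$LAST_EXTRACT_DATE=$LAST_RUN
\$DBConnection_SRC=ORA_SRC
\$DBConnection_TGT=ORA_DW
EOF

# After a successful workflow run, the wrapper would advance the watermark:
date '+%Y-%m-%d' > "$WATERMARK"
```

Because the watermark is advanced only after a successful run, a failed load simply reprocesses the same window on the next attempt instead of silently skipping data.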

Environment: Informatica Power Center 9.1.0, Oracle 10g, Teradata, TOAD 9.0, PL/SQL, Informatica Data Quality (IDQ), MS SQL Server 2005, UNIX, and PuTTY.

ETL Developer

Confidential

  • Involved in all phases of the SDLC from requirements, design, development, testing, training, and rollout to field users, as well as production support.
  • Interacted with the source system to gather the requirements.
  • Gathered requirement and prepared mapping documents and the data flow documents.
  • Worked with power Center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Experience working with Cognos 10 to generate reports.
  • Experience working with mainframes, writing JCL to kick off the workflows.
  • Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
  • Experience analyzing user requirements and translating them into system data structure designs.
  • Extensively used Informatica Transformation like Source Qualifier, Rank, Router, Filter, Lookup, Joiner, Aggregator, Normalizer, Sorter etc. and all transformation properties.
  • Solid Expertise in using both connected and unconnected Lookup Transformations.
  • Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
  • Updated existing UNIX shell scripts to run the jobs.
  • Involved in writing T-SQL procedures using SQL Server.
  • Wrote PL/SQL blocks, Procedures, Functions, and Packages.
  • Extensively worked with Joiner functions like normal join, full outer join, master outer join and detail outer join in the Joiner transformation.
  • Worked with session logs and workflow logs for error handling and troubleshooting in the Dev environment.
  • Involved in tuning SQL scripts to load data faster.
  • Developed reusable transformations and reusable Mapplets.
  • Worked extensively with session parameters, Mapping Parameters, Mapping Variables, and Parameter files for incremental loading.
  • Worked with Shortcuts across shared and non-shared folders.
  • Responsible for Unit Testing of Mappings and Workflows.
  • Responsible for implementing Incremental Loading mappings using Mapping Variables and Parameter Files.
  • Implemented various loads like Daily Loads, Weekly Loads, and Quarterly Loads using Incremental Loading Strategy.
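The daily/weekly/quarterly load scheme above can be sketched as a single wrapper that dispatches each frequency to its own workflow; in production each frequency would typically be a separate scheduled job passing the argument. Workflow, folder, and service names are hypothetical:

```shell
#!/bin/sh
# Sketch of dispatching daily, weekly, and quarterly loads from one wrapper.
# Workflow, folder, domain, and service names are hypothetical placeholders.

run_load() {
    case "$1" in
        daily)     wf=wf_daily_incr_load ;;
        weekly)    wf=wf_weekly_incr_load ;;
        quarterly) wf=wf_quarterly_full_load ;;
        *)         echo "usage: run_load daily|weekly|quarterly" >&2; return 1 ;;
    esac
    # The real invocation would start the PowerCenter workflow, e.g.:
    #   pmcmd startworkflow -sv INT_SVC -d DOMAIN -f SALES_FOLDER "$wf"
    echo "starting $wf"
}

run_load daily
run_load weekly
run_load quarterly
```

Separating the frequencies into distinct workflows keeps the heavy quarterly full load from sharing sessions (and failure modes) with the lightweight daily incremental run.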

Environment: Informatica Power Center 8.6, Informatica Power Exchange 8.x, Mainframes, Oracle 10g, PL/SQL, TOAD, PL/SQL Developer, UNIX, Cognos 10, and PuTTY.
