
Data Movement Engineer (ETL/SQL Developer) Resume

PROFESSIONAL SUMMARY:

  • 7+ years of IT experience in software analysis, design and development of client-server applications, delivering Business Intelligence solutions in Data Warehousing, with domain knowledge of the Health Care & Pharmacy, Banking and Finance, and Insurance industries for Decision Support Systems.
  • 7+ years of ETL experience on Data Warehouse, Data Mart, Data Integration and Data Conversion projects using Informatica PowerCenter 9.x/8.x, Data Quality Analysis and Data Profiling.
  • Experience with Data flow diagrams, Data dictionary, Database normalization theory techniques, Entity relation modeling and design techniques.
  • Expertise in client-server application development using Oracle 11g/10g/9i/8i, PL/SQL, SQL*Plus, TOAD and SQL*Loader.
  • Effectively made use of Table Functions, Indexes, Table Partitioning, Collections, Analytical Functions, Materialized Views, Query Rewrite and Transportable Tablespaces.
  • Strong experience in data warehouse concepts and ETL.
  • Good knowledge of logical and physical data modeling using normalization techniques.
  • Created Tables, Views, Constraints, Index (B Tree, Bitmap and Function Based).
  • Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
  • Extensively used ETL methodology for supporting Extract, Transform, and Load environments using Informatica PowerCenter 9.x/8.x (Designer, Repository Manager, Repository Server Administration Console, Server Manager, Workflow Manager, Workflow Monitor).
  • Experience in understanding business requirements and translating them to ETL code by working along with business analysts to identify and study the requirements.
  • Extensive experience in creating and implementing complex business rules via transformations and re-usable transformations (Expression, Aggregator, Filter, Connected and Unconnected Lookups, Router, Rank, Joiner, Update Strategy), and developing complex mapplets and mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Extensively used Enterprise Data warehousing ETL methodologies for supporting data extraction, Transformation and loading processing, in a corporate-wide-ETL Solution using Informatica Power Center.
  • Experience in the creation, execution, testing and debugging of mappings, mapplets, sessions, tasks, worklets and workflows in UNIX and Windows environments. Strong in using workflow tasks like Session, Control, Command, Decision, Event Wait and Email tasks, and pre-session/post-session commands.
  • Solid experience in Data Modeling, Star Schema, Snowflake Schema, FACT tables, Dimension tables.
  • Hands on experience on creating complex SQL queries at database level to improve the session performance.
  • Experience in optimizing the performance of SQL scripts and Oracle database/application tuning.
  • Experience in different schemas (Star and Snowflake) to fit reporting, query and business analysis requirements.
  • Extensive experience in full life cycle development with emphasis on project management, user acceptance testing, programming and documentation.
  • Experience in writing UNIX shell scripts and SQL scripts for development, automation of ETL processes, error handling and reporting purposes.
  • Experience in integration of various data sources like Flat files, Oracle, SQL server, SAP and DB2 into staging area.
  • Used stored procedures for truncating the tables and deleting the rows from error logs.
  • Involved in 24x7 Production Support by performing Normal Loads, Bulk Loads, Initial Loads, Daily Loads and Monthly Loads.
  • Developed Batch Jobs using UNIX Shell scripts to automate the process of loading, pushing and pulling data from different servers.
  • Maintained outstanding relationship with Business Analysts and Business Users to identify information needs as per the business requirement.
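The shell-based ETL automation described above (loading, archiving, error handling) can be sketched roughly as follows. This is a minimal illustration, not the actual production scripts: every path, file name, and the placeholder load step are hypothetical.

```shell
#!/bin/sh
# Minimal sketch of an ETL automation wrapper: pick up source files,
# run a (placeholder) load step, archive on success, log errors.
# All paths and file names below are hypothetical demo values.
set -eu

SRC_DIR=/tmp/etl_demo/incoming
ARC_DIR=/tmp/etl_demo/archive
LOG_FILE=/tmp/etl_demo/etl.log

mkdir -p "$SRC_DIR" "$ARC_DIR"
: > "$LOG_FILE"

log() { echo "$(date '+%Y-%m-%d %H:%M:%S') $*" >> "$LOG_FILE"; }

printf 'row1\nrow2\n' > "$SRC_DIR/sample.dat"   # demo input file

for f in "$SRC_DIR"/*.dat; do
    [ -e "$f" ] || { log "no source files found"; break; }
    if [ -s "$f" ]; then
        # The real load step (e.g. SQL*Loader or a pmcmd call) would go here.
        ts=$(date '+%Y%m%d%H%M%S')
        mv "$f" "$ARC_DIR/$(basename "$f").$ts"
        log "loaded and archived $(basename "$f")"
    else
        log "ERROR: empty source file $f"
        exit 1
    fi
done
```

In a real batch setting a scheduler (TWS, Control-M, Autosys) would invoke a script of this shape and alert on a non-zero exit code.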

TECHNICAL SKILLS:

Databases: Oracle 11g/9i, DB2/UDB, Microsoft SQL Server 2008, Sybase, MS Access.

ETL Tools: Informatica PowerCenter 9.x/8.x/7.x, PowerMart 8.x/7.x, PowerConnect for ERP and Mainframes, PowerExchange 8.x, Data Quality Analysis.

Reporting Tools: Business Objects Developer Suite 5.1.

Languages/Utilities: SQL, PL/SQL, UNIX shell scripts, XML.

Operating Systems: UNIX (Sun Solaris, Linux, HP-UX, AIX), Windows NT/95/98/2000/XP.

Other Tools: TOAD, SQL*Loader, SQL Server Management Studio, MS Visio 2007, Control-M, WinSQL, Putty, SharePoint, PeopleSoft Financials 8.x/9.x.

PROFESSIONAL EXPERIENCE:

Confidential

Data Movement Engineer (ETL/SQL Developer)

Responsibilities:

  • Actively involved in project planning discussions and designed data flow diagrams for development of the data flow from start to end.
  • Designed and created tables at the data mart level; defined and created unique keys and table-level constraints for data handling.
  • Created views at the database level to reduce complex join conditions, performed data checks at the Informatica level for delta loads to improve ETL job performance, and tested stored procedures, cursors, functions and packages using PL/SQL.
  • Created Tables, Views, Constraints, Index (B Tree, Bitmap and Function Based).
  • Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
  • Developed materialized views for data replication in distributed environments.
  • Created re-usable stored procedures and PL/SQL scripts for key generation at the database level.
  • Used Oracle-supplied packages, Dynamic SQL, records and PL/SQL tables; loaded data into Oracle tables using SQL*Loader.
  • Extracted data from different source systems like DB2, SQL Server, XML files and flat files, and loaded it into different target systems like DB2, SQL Server, XML files and flat files.
  • Extracted inbound XML files from BenefitFocus Inc. and loaded them into SQL Server tables, developed an internal process to load data from SQL Server to the DB2 database for the GEN application, and finally extracted data from DB2 tables to generate outbound hierarchical XML files.
  • Created various active and passive transformations like Source Qualifier, Lookup, Router, Normalizer, Aggregator, Filter, Joiner and Expression, along with standard/reusable logic in Informatica; used the XML Generator and XML Parser transformations to transform XML files and the SQL Server XML data type to store them.
  • Coded and created Informatica jobs for application rationalization and replaced Mainframe jobs with Informatica.
  • Created re-usable java transformation scripts for data pivoting at Informatica level.
  • For the Mainframe ETL job inventory process, queried the repository database to find workflows, sessions and mappings triggered by TWS mainframe distribution.
  • Used the Informatica MetaQuery tool to extract and document workflows with run statistics for at least the previous two years, to determine whether each workflow was still active or could be retired.
  • Worked with business users to gather file transfer requirements and implemented them with Sterling or Lockbox.
  • Created ETL jobs to replace COBOL jobs, migrating from TWS mainframe to standard TWS distributed scheduling.
  • Created reusable mapplets for complex scenarios needed in multiple mappings.
  • Used debugger and breakpoints to view transformations output and debug mappings.
  • Created views at the database level to join and retrieve data from multiple tables.
  • Tuned the mappings to enhance performance after the mappings functionality was validated.
  • Created dynamic parameter files for workflows.
  • Created workflow variable and mapping variables to send mapping parameter and variable values from one session to another session in same workflow.
  • Created UNIX scripts for pre-session commands to move files between directories on the server, and post-session commands to attach statistics after a successful session run and to archive files.
  • Worked on configuring the Java class path in the session properties while using third-party Java packages, built-in Java packages, and custom Java packages in a Java transformation and also in Java SDK Custom transformation.
  • Used session partitions, dynamic cache memory, and index cache to improve the performance.
  • Actively involved in code migrations and testing and Integration testing and user acceptance testing.
  • Extensive experience in production support, addressing and resolving issues in a timely manner.
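The dynamic parameter files mentioned above can be generated from a small shell script of roughly this shape. The folder, workflow, session, and parameter names here are hypothetical placeholders, not the project's actual values:

```shell
#!/bin/sh
# Sketch: generate an Informatica parameter file with run-specific values.
# DEMO_FOLDER, wf_daily_load, s_m_load_stage and the $$ parameters are
# hypothetical names invented for this illustration.
set -eu

PARAM_DIR=/tmp/infa_demo
mkdir -p "$PARAM_DIR"

RUN_DATE=$(date '+%Y-%m-%d')
PARAM_FILE="$PARAM_DIR/wf_daily_load.param"

# Informatica parameter files use [Folder.WF:workflow.ST:session]
# section headers followed by name=value pairs.
cat > "$PARAM_FILE" <<EOF
[DEMO_FOLDER.WF:wf_daily_load.ST:s_m_load_stage]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_FILE=/data/in/customers_$RUN_DATE.dat
EOF

echo "wrote $PARAM_FILE"
```

A scheduler can run this just before the workflow so each session picks up the current run date and file name without code changes.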

Environment: Windows 7, Informatica PowerCenter 9.1/9.5, Mainframe, DB2, SQL Server, UNIX, XML files, WinSQL, Putty, WinSCP, SharePoint, TWS, Informatica MetaQuery tool.

Data Analyst

Confidential

Responsibilities:

  • Conducted user interviews and data analysis review meetings.
  • Defined key facts and dimensions necessary to support the business requirements along with Data Modeler.
  • Created draft data models for understanding and to help Data Modeler.
  • Resolved the data related issues such as: assessing data quality, data consolidation, evaluating existing data sources.
  • Manipulated, cleansed and processed data using Excel, Access and SQL.
  • Performed Data Validations using SQL developer.
  • Responsible for loading, extracting and validation of client data.
  • Worked closely with Data Architect to review all the conceptual, logical and physical database design models with respect to functions, definition, maintenance review and support Data analysis, Data Quality and ETL design that feeds the logical data models.
  • Analyzed the source data coming from various data sources like Mainframe & Oracle.
  • Created data mapping documents mapping Logical Data Elements to Physical Data Elements and Source Data Elements to Destination Data Elements.
  • Managed, updated and manipulated report orientation and structures using advanced Excel functions, including Pivot Tables and VLOOKUPs.
  • Tested the data using the logs generated after loading the data into the data warehouse.
  • Prepared Traceability Matrix with requirements versus test cases.
  • Worked on Master Data Management (MDM) for maintaining the customer information also for the ETL rules to be applied.
  • Worked on different layers of the Business Intelligence infrastructure.
  • Worked extensively in Data consolidation and harmonization.
  • Met with user groups to analyze requirements and propose changes in design and specifications.
  • Performed Detailed Data Analysis (DDA), Data Quality Analysis (DQA) and Data Profiling on source data.
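A record-count reconciliation of the kind used in the data validation work above might look like the sketch below. In practice the counts would come from SQL against the source and target databases; the two demo extract files here are hypothetical stand-ins:

```shell
#!/bin/sh
# Sketch: source-vs-target record-count reconciliation.
# The two .dat extracts are invented demo data, not real feeds.
set -eu

WORK=/tmp/dq_demo
mkdir -p "$WORK"

# Demo extracts: a source feed and the rows loaded to the target.
printf 'id|name\n1|a\n2|b\n3|c\n' > "$WORK/source.dat"
printf 'id|name\n1|a\n2|b\n3|c\n' > "$WORK/target.dat"

src=$(($(wc -l < "$WORK/source.dat") - 1))   # subtract the header row
tgt=$(($(wc -l < "$WORK/target.dat") - 1))

if [ "$src" -eq "$tgt" ]; then
    echo "PASS: source=$src target=$tgt" > "$WORK/dq_result.txt"
else
    echo "FAIL: source=$src target=$tgt" > "$WORK/dq_result.txt"
    exit 1
fi
```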

Environment: Oracle 11g, SQL Server 2012 and 2014, DB2 UDB, PL/SQL, Erwin 4, TOAD, MS Access, MS Excel.

Confidential

Business Analyst

Responsibilities:

  • Interacted with Business Analysts and requirement analysts to gather business requirements and created design documents.
  • Wrote UNIX shell scripts to process files on a daily basis: renaming the file, extracting the date from the file, unzipping the file and removing junk characters before loading into the base tables.
  • Involved in the continuous enhancements and fixing of production problems.
  • Generated server side PL/SQL scripts for data manipulation and validation and materialized views for remote instances.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.
  • Created scripts to create new tables, views, queries for new enhancement in the application using TOAD.
  • Created indexes on the tables for faster retrieval of the data to enhance database performance.
  • Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.
  • Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.
  • Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines.
  • Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package.
  • Created database objects like tables, views, materialized views, procedures and packages using Oracle tools like TOAD, PL/SQL Developer and SQL*Plus.
  • Partitioned the fact tables and materialized views to enhance performance; extensively used bulk collection in PL/SQL objects to improve performance.
  • Worked extensively on flat files, XML files and databases; extracted, transformed and loaded data from sources such as flat files and DB2 to targets such as flat files and DB2 tables, and transferred data in XML format.
  • Extensively used SQL queries for data validation and overrides in Informatica.
  • Designed and developed complex Aggregate, Join, Union, Router and Lookup transformation rules (business rules) to generate consolidated data identified by dimensions using Informatica ETL tool.
  • Worked on Informatica PowerCenter tools (Source Analyzer, Target Designer, Mapping Designer and Transformation Developer); designed and developed Informatica mappings for data load and data cleansing from multiple source systems to targets, and developed several complex mappings, mapplets and reusable transformations to facilitate one-time, daily, weekly and monthly loading of data.
  • Worked with Scheduler to run session on daily basis and send pre and post session emails to communicate success or failure of session after the completion.
  • Worked on Parameter files, UNIX shell scripts to add Header & Trailer, sorting, merging flat files.
  • Created the UNIX scripts for checking the source files, generating the parameter files, loading the Data from flat files to tables and creating the archive files.
  • Coordinated with source system owners and monitored day-to-day ETL progress.
  • Implemented update strategies, incremental loads, Data capture and Incremental Aggregation.
  • Developed workflow tasks like Email, Event wait, Event Raise, Timer, Command and Decision.
  • Performed day to day migrations of various Informatica objects using deployment groups and copy-wizard option.
  • Created UNIX Shell scripts for data file handling and Archive process and called as pre session and post session commands.
  • Developed Documentation for all the routines (Mappings, Sessions and Workflows).
  • Performed Unit, Integration and system testing and provided UAT support to business partners.
  • Implemented SCD methodology including Type 2 to keep track of historical data.
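The header-and-trailer handling for flat files mentioned above can be sketched as follows. The record layout, delimiter, and paths are hypothetical, chosen only to illustrate the pattern:

```shell
#!/bin/sh
# Sketch: wrap a detail file with a header record (file date) and a
# trailer record (detail count) before transmission.
# File names and the H|/T| layout are invented demo values.
set -eu

WORK=/tmp/flatfile_demo
mkdir -p "$WORK"

printf '1|alice\n2|bob\n' > "$WORK/detail.dat"   # demo detail records

recs=$(wc -l < "$WORK/detail.dat" | tr -d ' ')
{
    echo "H|$(date '+%Y%m%d')"        # header: record type + file date
    cat "$WORK/detail.dat"            # detail records, unchanged
    echo "T|$recs"                    # trailer: record type + detail count
} > "$WORK/outbound.dat"
```

The receiving system can then verify completeness by comparing the trailer count against the number of detail records actually received.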

Environment: Windows 7, Informatica PowerCenter 9.1/8.6, Oracle 11g, DB2, Sybase, UNIX, FACETS, XML, Cybermation, WinSQL, Putty, and SharePoint.

Confidential

Informatica (ETL) Developer

Responsibilities:

  • Worked with project managers, design lead and solution architect to achieve business and functional requirements and created ETL mapping and design documents.
  • Understood the logical and physical models for the staging and target databases; created a technical document for the ETL process and design documents for each module.
  • Used SQL Server and ETL tools to build high performance data integration solutions including extraction, transformation and load packages for data warehousing. Extracted data from the XML file and loaded it into the database.
  • Designed and developed Oracle forms & reports generating up to 60 reports.
  • Performed modifications on existing form as per change request and maintained it.
  • Used Crystal Reports to track logins, mouse overs, click-through, session durations and demographical comparisons with SQL database of customer information.
  • Worked on SQL*Loader to load data from flat files obtained from various facilities every day. Used standard packages like UTL_FILE, DBMS_SQL and PL/SQL Collections; used bulk binding and wrote database procedures, functions and packages for the front-end module.
  • Used principles of normalization to improve performance. Wrote ETL code in PL/SQL to meet requirements for extraction, transformation, cleansing and loading of data from source to target structures.
  • Worked cooperatively with the team members to identify and resolve various issues relating to Informatica and other database related issues.
  • Maintained versions of mappings, mapplet, workflows, sessions, documentation using PVCS.
  • Involved in the creation of oracle Tables, Table Partitions, and Indexes.
  • Identified and tracked the slowly changing dimensions Type 1 and Type 2 (for change data capture), heterogeneous Sources and determined the hierarchies in dimensions.
  • Responsible for monitoring all the sessions that are running, scheduled, completed and failed.
  • Worked on database connections, SQL joins, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Worked on Power Center Designer client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
  • Extensively used Source Qualifier Transformation to filter data at Source level rather than at Transformation level.
  • Used most of the common transformations, including Source Qualifier, Aggregator, Connected and Unconnected Lookup, Union, Router, Update Strategy, Filter and Sequence Generator.
  • Used PL/SQL scripts to automate the process of creating and dropping indexes before and after the ETL process.
  • Extensively worked on operational support and deployments to the test environment.
  • Used Informatica Server Manager to create, schedule, monitor sessions and send pre and post session emails to communicate success or failure of session execution.
  • Daily Data Validation and data quality checks at database level using SQL scripts.
  • Unit and integration testing for the changes done as a part of enhancements. Coordinating debugging effort with testing team
  • Tracking and resolving of defects.
  • Responsible for monitoring all the sessions that are running, scheduled, completed and failed using Autosys. Debugged the mapping of the failed session.
  • Improved the overall performance by tuning the whole ETL process.
  • Used the pmcmd command to start, stop and ping the server from UNIX, and created UNIX shell and Perl scripts to automate the process.
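A pmcmd wrapper script of the kind described above might look like this sketch. The service, domain, folder, and workflow names are hypothetical, and a DRY_RUN mode only echoes the assembled command, since pmcmd itself exists only on an Informatica host:

```shell
#!/bin/sh
# Sketch: wrapper that assembles and (optionally) runs a pmcmd
# startworkflow command. All names below are invented placeholders;
# -pv reads the password from the named environment variable.
set -eu

DRY_RUN="${DRY_RUN:-1}"
INFA_SVC=IS_DEMO
INFA_DOMAIN=Domain_demo
INFA_USER=admin
FOLDER=DEMO_FOLDER
WORKFLOW="${1:-wf_daily_load}"

CMD="pmcmd startworkflow -sv $INFA_SVC -d $INFA_DOMAIN -u $INFA_USER -pv INFA_PASSWD -f $FOLDER -wait $WORKFLOW"

mkdir -p /tmp/pmcmd_demo
echo "$CMD" > /tmp/pmcmd_demo/last_cmd.txt   # record what would be run

if [ "$DRY_RUN" -eq 1 ]; then
    echo "would run: $CMD"
else
    $CMD
fi
```

With -wait, pmcmd blocks until the workflow completes and returns a non-zero exit code on failure, which lets a scheduler detect failed loads.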

Environment: Informatica PowerCenter 8.6.1, Oracle 9i, PL/SQL, SQL Server 2008, UNIX Shell Scripting, Windows XP, SQL Assistant, Erwin 3.5.2, AIX.
