
Sr. Data Solution Developer ETL Resume


Charlotte, NC

SUMMARY

  • Over 6 years of IT experience in Analysis, Design, Development and Implementation of Relational Database Management Systems and Enterprise Data Warehousing Systems.
  • Extensive ETL experience in Enterprise Data Warehousing, Data Architecture, Data Modeling, Data Mining, Data Analysis and Decision Support System (DSS).
  • Strong experience in Data warehouse development life cycle and Design of Data marts with Star, Snowflake schema and 3rd Normal Forms.
  • Experienced in multiple industries including Insurance, Banking, Finance and Health Care.
  • Extensive experience using Informatica PowerCenter versions 9.6.1/9.5.1/8.x to carry out Extraction, Transformation and Loading (ETL) processes.
  • Experience in Informatica PowerCenter administration using the Admin console.
  • ETL Application Registration (Fully Hosted/Evaluation Only): reviewing the configuration file, creating the UNIX file system and creating an opportunity in SFDC; maintained the EBIS Web Calendar and status report.
  • Informatica SCM (folder-level and object-level migration); Informatica object migration using Repository Manager.
  • Extensively worked on Informatica Components like Designer, Workflow manager, Workflow monitor and Repository manager.
  • Worked with Informatica Server and Client tools. Experienced in Data Analysis, Design, Development, Implementation, Testing and Production Support of Database/Data Warehousing/Legacy applications for various industries, using Data Modeling, Data Extraction, Data Transformation and Data Loading.
  • Strong Working Experience of PowerCenter Administration, Designer, Informatica Repository Administrator console, Repository Manager and Workflow Manager.
  • Worked on Administration of Informatica PowerCenter and Ascential DataStage.
  • Installation of Informatica patches and upgrades, user access management, deployment activities, on - call support and capacity planning.
  • Strong Knowledge of Data Warehouse Architecture and of designing Star Schemas, Snowflake Schemas, Fact and Dimension Tables, and Physical and Logical Data Modeling using Erwin.
  • Extensive experience in designing and developing various complex mappings using transformations such as lookup, source qualifier, update strategy, router, sequence generator, aggregator, rank, stored procedure, filter, joiner and sorter.
  • Extensive experience in developing Mappings, Mapplets, Sessions, Worklets, Workflows and configuring the necessary connections in Informatica Power Center.
  • Experience in identifying bottlenecks in ETL Processes, improving the performance of the production processes by implementing Database Tuning techniques like Partitioning, Index Usage, Aggregate Tables, parallel processing and Normalization strategies.
  • Experienced in using the IDQ tool for profiling, applying rules and developing mappings to move data from source to target systems.
  • Expertise in several key areas of Enterprise Data Warehousing, such as Change Data Capture (CDC), Data Quality, lookup tables and ETL data movement.
  • Experience in Business Glossary Administration using Informatica Business Glossary 10.0.
  • Extensively worked in Agile mode, worked on IBM Rational Tool to update stories and tasks.
  • Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Strong programming experience in SQL, MSQL, PL/SQL, Stored Procedures, Functions, Packages, Constraints, Collections, Triggers and Indexes.
  • Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting and Autosys.
  • Strong experience working with different RDBMS like Oracle, SQL Server, Teradata, DB2 and with different file systems like Flat Files, COBOL VSAM files and XML Files both as Source and as well Target.
  • Experienced in Design, Development and Maintenance of Data Marts.
  • Experience in designing E-R diagrams, Logical and Physical database designs using Erwin and Visio.
  • Experience in understanding the Business requirements and translating them into Detailed Design Documents along with Technical specifications.
  • Created the Unit Test Case document and performed testing like Unit Testing (UT) and System Integration Testing (SIT).
  • Used HP Application Lifecycle Management (HP ALM) and HP Quality Center Enterprise for Informatica testing.
  • Hands on experience in all aspects of Software Development Life Cycle (SDLC).
  • Excellent communication and interpersonal skills.

TECHNICAL SKILLS:

ETL Tools: Repository Server Administrator Console, Informatica PowerCenter 9.6.1/9.5.1/8.x/7.x, IDQ 9.x/8.x, Informatica Developer (IDQ) 9.1/9.5.1, Informatica B2B Data Exchange, B2B Data Transformation, ODI 12c/11g, HP ALM (testing tool)

Hadoop Ecosystems: Hadoop, MapReduce, HDFS, HBase, Zookeeper, Hive, Pig, Cassandra, Oozie, Flume, MongoDB

Reporting Tools: Knowledge of OBIEE, Tableau, Business Objects 4

Version control: Perforce

Databases: Oracle 12c/11g/10g, MS SQL Server 2014/2012/2008, Teradata 14/13/12, PostgreSQL, DB2 UDB v8, Netezza, HBase, MongoDB

Languages: T-SQL, PL/SQL, HTML, Unix Shell Scripting, C, JAVA, XML, Perl, Python

DB Tools: Toad, SQL* Loader, SQL server management studio, SQL Developer

Modeling Tools: ER Studio, Erwin

Scheduling Tools: Autosys 4.5, Control-M, Tivoli

Desktop App: Microsoft Office Suite (Word, Excel, Power Point, Access, Outlook)

Operating Systems: Windows 7, Windows Server 2010, UNIX, LINUX

PROFESSIONAL EXPERIENCE

Confidential,

Sr. Data Solution Developer ETL

Responsibilities:

  • Design and Develop ETL programs primarily using Informatica PowerCenter, PL/SQL, Unix Shell Scripts
  • Extract, transform & load into dimension model structures of EDW & data marts
  • Performance improvement & tuning of Informatica mappings, SQL & PL/SQL scripts
  • Strong SQL skillset with demonstrated experience on SQL Server, Oracle, DB2
  • Technical reviews, Data validation & end to end testing of ETL Objects, Source data analysis and data profiling.
  • Create necessary documentation as per SDLC standards
  • Update project documentation (Change Requests and JIRA tickets) in Confluence
  • Upload the code to the version control tool (Git)
  • Perform Unit Testing of ETL Programs and support QE on End-to-End testing
  • Work with deployment team in the code migration across the SDLC environments
  • Triage the Production issues and work on the code fixes
  • Participate in a 24x7 rotational on-call support schedule
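ETL runs like those above are commonly kicked off from a UNIX shell wrapper. A minimal sketch, assuming Informatica's `pmcmd` launcher; all service, domain, folder and workflow names below are placeholders, and the exact flag set should be verified against the installed PowerCenter version:

```shell
#!/bin/sh
# Hypothetical wrapper that launches an Informatica workflow via pmcmd
# and reports the outcome. Every name below is a placeholder.

INT_SERVICE="IS_EDW_DEV"      # integration service (placeholder)
DOMAIN="Domain_EDW"           # Informatica domain (placeholder)
FOLDER="EDW_LOADS"            # repository folder (placeholder)
WORKFLOW="wf_load_customer"   # workflow name (placeholder)

# -wait blocks until the workflow finishes, so the exit status of the
# launcher reflects the workflow outcome.
CMD="pmcmd startworkflow -sv $INT_SERVICE -d $DOMAIN -f $FOLDER -wait $WORKFLOW"

if command -v pmcmd >/dev/null 2>&1; then
    $CMD
    STATUS=$?
else
    # pmcmd is not installed on this machine; just show what would run
    echo "would run: $CMD"
    STATUS=0
fi

if [ "$STATUS" -eq 0 ]; then
    echo "workflow launch reported success"
else
    echo "workflow failed rc=$STATUS" >&2
fi
```

Schedulers such as Autosys or Control-M would then call this script and act on its exit status.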

Confidential, Charlotte, NC

Sr. ETL Informatica Developer and Administrator

Responsibilities:

  • Interacted with business users to identify process metrics and key dimensions and measures; involved in the complete life cycle of the project.
  • Worked on dimensional modeling to design and develop star schemas by identifying the facts and dimensions; designed logical models per business requirements using Erwin.
  • Understand the existing subject areas, source systems, target system, operational data, jobs, deployment processes and Production Support activities.
  • Design queries for marketing data marts and then figure out how to make these queries yield comparable data.
  • Discover the source data that causes the problems downstream.
  • Analyzed the data models of the source & target systems to develop comprehensive mapping specifications.
  • Applied Master Data Management (MDM) concepts and methodologies in building MDM solutions.
  • Good understanding of entity relationships and data models.
  • Worked with PowerCenter Designer tools to develop mappings and mapplets that extract data from flat files, XML files and Oracle sources and load it into Oracle targets.
  • Created transformations such as Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expression, Aggregator and Sequence Generator for loading the data into targets.
  • Created mappings with different lookups (connected, unconnected and dynamic) and different caches, such as persistent cache.
  • Created workflows with tasks such as sessions, event raise, event wait, command, decision, e-mail and link tasks, and scheduled them with the Informatica scheduler to run the mappings.
  • Developed complex mappings such as Slowly Changing Dimensions (SCD) Type 1, Type 2 and Type 3 with time stamping in the Mapping Designer.
  • Created sessions and configured workflows to extract data from various sources, transform it and load it into the data warehouse.
  • Used Variables and Parameters in the mappings to pass the values between sessions.
  • Involved in migration of the maps from IDQ to PowerCenter.
  • Applied the rules and profiled the source and target table's data using IDQ.
  • Performed data quality analysis, gathered information to determine data sources, data targets, data definitions, data relationships, and documented business rules.
  • Performed Business Glossary administration using Informatica Business Glossary 9.6.1: managed users and roles to control which glossary assets users can access, imported and exported glossaries, and modified business term and policy templates.
  • Worked with workspace glossary security, including the Glossary Security, Library and Glossary workspaces.
  • Monitored scheduled, running, completed and failed sessions using the Workflow Monitor; debugged mappings for failed sessions.
  • Executed sessions and sequential and concurrent batches to ensure proper execution of mappings.
  • Prepared the migration document to move objects from development to testing and then to production repositories.
  • Used UNIX shell scripts, invoked through Informatica command tasks, for merging files, data validation, load validation and archiving of files. Created mapplets for reuse across different mappings.
  • Extensively used SQL* loader to load data from flat files to the database tables in Oracle.
  • Wrote SQL queries to identify and validate the code written for data movement into the database tables, and fine-tuned the queries for better performance.
  • Created UNIX shell scripts and pmcmd commands for FTP of files from remote servers and for backup of the repository and folders.
  • Developed shell scripts for the initial conversion of dimensions and for validation of source files and data loading procedures; wrote multiple scripts for pre-processing the files.
  • Used the Target Load Ordering with Stored Procedures to update database.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Performed code reviews and unit testing, and was actively involved in the defect management process for ETL fixes and retests.
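The file-handling work above (merging extract files, load validation, archiving) is plain shell scripting. A self-contained sketch under assumed file names and a `|`-delimited layout; the paths and counts are illustrative only:

```shell
#!/bin/sh
# Sketch: merge daily extract files, validate the merged row count,
# then archive the inputs with a timestamp. Paths/names are assumptions.

WORK=$(mktemp -d)
ARCHIVE="$WORK/archive"
mkdir -p "$ARCHIVE"

# Simulate two daily extract files (header + detail rows)
printf 'id|name\n1|a\n2|b\n' > "$WORK/extract_1.dat"
printf 'id|name\n3|c\n'      > "$WORK/extract_2.dat"

# Merge: keep the header from the first file only
head -1 "$WORK/extract_1.dat" > "$WORK/merged.dat"
for f in "$WORK"/extract_*.dat; do
    tail -n +2 "$f" >> "$WORK/merged.dat"
done

# Load validation: detail rows in the merge must equal the sum of inputs
expected=3
actual=$(( $(wc -l < "$WORK/merged.dat") - 1 ))
if [ "$actual" -ne "$expected" ]; then
    echo "validation failed: expected $expected rows, got $actual" >&2
    exit 1
fi

# Archive the processed inputs with a timestamp suffix
ts=$(date +%Y%m%d%H%M%S)
for f in "$WORK"/extract_*.dat; do
    mv "$f" "$ARCHIVE/$(basename "$f").$ts"
done
echo "merged $actual rows; inputs archived"
```

In practice such a script would run as an Informatica command task before or after the session, with the row counts compared against session log statistics.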

Environment: Informatica PowerCenter 9.6.1, SQL*Loader, Oracle 12c/11g, ESP, Rational ClearCase, Informatica IDQ, Rational ClearQuest, Windows XP, TOAD, UNIX, SharePoint, Java, J2EE, DB2, XML, SoapUI, SQL Server, ALM Quality Center 12.02, QTP, SQL, MS Visio, LoadRunner and PowerShell.

Confidential

Informatica Developer

Responsibilities:

  • Understanding the reporting requirement and the dimensional data-marts.
  • Documenting every ETL mapping specification. Preparing documentation for source analysis & ETL design, which are required prior to development.
  • Providing input for changes in the Transparency Data Mart. Performing data analysis and working with DBA in designing tables in the Data Marts based on reporting needs.
  • Code reviews for the code developed by peer developers
  • Managed source control on behalf of the team.
  • Responsible for preparing and tracking Project Plan.
  • Preparing ETL Estimates and negotiating the same with client.
  • Created technical design documents (high-level and low-level).
  • Create ETL specifications, test cases documents for all the mappings developed.
  • Responsible for overall quality of deliverables.
  • Delivered many re-usable components.
  • Responsible for mentoring the new resources brought into the team
  • Performed code reviews.
  • Involved in extraction, cleansing and loading of data into DB2 UDB database from flat files and DB2 tables.
  • Identified anti-fraud, money laundering transactions making use of Actimize detection software.
  • Identified the sequence in which sessions were to be executed and designed the workflow layout accordingly.
  • Developed mappings in Informatica to load the data from various data sources into the Data Marts, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter and Source Qualifier.
  • Wrote stored procedures to perform unique tasks that were too complex to implement within Informatica PowerCenter.
  • Performing code reviews & testing the ETL code that was modified in connection with this conversion process.
  • Performing ETL & database code migrations across environments.
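Code migrations across environments are usually scripted as well. A minimal sketch that promotes exported ETL object definitions (XML) from a DEV area to a TEST area and logs each promoted object; the directory layout and file names are hypothetical:

```shell
#!/bin/sh
# Sketch: promote exported ETL object definitions (XML) from one
# environment directory to the next, logging each promoted object.
# The directory layout and file names are illustrative only.

BASE=$(mktemp -d)
mkdir -p "$BASE/dev/exports" "$BASE/test/exports"

# Simulate two exported mapping definitions sitting in DEV
echo '<mapping name="m_load_orders"/>'    > "$BASE/dev/exports/m_load_orders.xml"
echo '<mapping name="m_load_customers"/>' > "$BASE/dev/exports/m_load_customers.xml"

LOG="$BASE/migration.log"
count=0
for f in "$BASE"/dev/exports/*.xml; do
    cp "$f" "$BASE/test/exports/"
    echo "promoted $(basename "$f") to TEST" >> "$LOG"
    count=$((count + 1))
done
echo "migrated $count objects"
```

With Informatica specifically, the export/import itself would be done through Repository Manager or `pmrep`; the shell layer's job is just sequencing, logging and auditability.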

Environment: Informatica PowerCenter 7.1.2/6.1.3, Teradata, Oracle 9i, Informatica Metadata Reporter, PL/SQL, SQL*Loader, Oracle 10g, Oracle Enterprise Manager, Toad, Sun Solaris, UNIX Shell Scripting, Teradata SQL Assistant, Teradata external loaders, Actimize, Windows NT, IBM OS/390, MVS, DB2, VSAM, JC
