
Sr. Informatica Developer Resume

Birmingham, Alabama

SUMMARY:

  • 9+ years of experience in data architecture, data modeling, data development, and the implementation and maintenance of databases and software applications. Strong business analysis skills for identifying business problems and processes, with the ability to work both individually and as part of a team.
  • Strong background in database development and ETL data warehousing using Informatica PowerCenter, Informatica Data Quality (IDQ), and big data tools.
  • Extensive experience with data extraction, transformation, and loading (ETL) from multiple sources such as XML, flat files, CSV files, and relational databases into a common analytical data model.
  • Expertise in data profiling and data cleansing using the Informatica Developer tool, including use of the Address Doctor engine via the Address Validator transformation to cleanse address data.
  • Expertise in implementing the Labeler and Standardizer transformations in IDQ for data cleansing.
  • Extensively worked on the development of complex mappings using transformations such as Expression, Connected/Unconnected Lookup, Filter, Router, Joiner, Union, Aggregator, Update Strategy, SQL, and Transaction Control.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded the data into HDFS.
  • Experience working with Teradata utilities including Teradata Parallel Transporter (TPT), BTEQ, FastLoad, MultiLoad, and SQL Assistant, as well as DDL and DML commands.
  • Expertise in using Informatica PowerCenter client tools such as Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer, Workflow Manager, and Workflow Monitor.
  • Conducted workshops with business stakeholders to identify current business challenges and categorize them as business, process, or technical.
  • Strong understanding of RDBMS and proficient in writing PL/SQL Procedures.
  • Expertise in OLTP/OLAP System, Analysis and database schemas like Star Schema and Snowflake Schema used in relational, dimensional and multidimensional modeling.
  • Experience in generating various reports using BI tools such as SSRS and Oracle BI Publisher.
  • Extensively worked on performance tuning techniques at various levels to increase throughput and reduce run times.
  • Developed and maintained Perl scripts for various functions. Experience scheduling jobs with CA Workload Automation ESP.
  • Worked with Oracle and SQL Server Stored Procedures, Triggers, Cursors, and Indexes and experienced in loading data into Data Warehouse/Data Marts.
  • Worked in Agile Development/Scrums environment and was responsible for delivering Sprint components.
  • Excellent communication and interpersonal skills. Good team player with strong analytical and presentation skills and an aptitude for learning new technologies.

TECHNICAL SKILLS:

Data Modeling Tools: ER/Studio 9.7/9.0, Erwin 9.6/9.5, Sybase PowerDesigner.

OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9

Cloud Platform: AWS, Azure, Google Cloud, Cloud Stack/OpenStack

Programming Languages: PL/SQL, SQL (PostgreSQL 9.4), UNIX shell scripting, Perl.

Databases: Oracle 12c/11g, Teradata 15.10/14.x/13.x/12.x, Netezza Mako 4/Mako 8.

Testing and Defect Tracking Tools: HP/Mercury Quality Center, WinRunner, MS Visio, and Visual SourceSafe.

Operating System: Windows, UNIX.

ETL/Data warehouse Tools: Informatica 9.6/9.1, Informatica IDQ 10.1/10.2

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model.

PROFESSIONAL EXPERIENCE:

Sr. Informatica Developer

Confidential - Birmingham, Alabama

Responsibilities:

  • Developed a Windows PowerShell script to perform file-level checks where the source is a flat file.
  • Used Control-M jobs to schedule the Windows PowerShell scripts.
  • Developed and implemented workflow batch files in PowerShell that automatically trigger the IDQ tool and perform IDQ-level checks.
  • Prepared the C101 requirements document per the project requirements.
  • Prepared the C204 test case document for each scenario.
  • Prepared the P950 technical design document for the IDQ development of mapplets and workflows.
  • Developed mapplets/rules and parameterized the values at the mapping level.
  • Performed code reviews and tracked project development through Jira tickets.
  • Attended daily standups and weekly meetings for the current project.
  • Involved in the design and implementation of a data mart to support the reporting needs of business users.
  • Performed analysis of source systems, gathered business requirements, and translated functional requirements into technical specifications.
  • Created workflows and mappings using Informatica PowerCenter to extract data from ODS sources and load it into the target data mart.
  • Used Informatica features to implement Type 1 and Type 2 changes in slowly changing dimension tables and developed complex mappings to facilitate daily, weekly, and monthly loading of data.
  • Developed Informatica mappings for complex business requirements using transformations such as Normalizer, SQL Transformation, Expression, Aggregator, Joiner, Lookup, Sorter, Filter, and Router.
  • Developed mapplets and reusable transformations to promote reuse and reduce effort.
  • Created workflows to process and load XML files received from external sources.
  • Analyzed the data from underlying sources to determine cleansing requirements.
  • Designed workflows using the tasks available in the Developer tool, such as Human Task, Inclusive Gateway, Exclusive Gateway, Notification Task, and Mapping Task, for running tasks in parallel or in sequence.
  • Scheduled jobs with the IDQ scheduler by deploying the workflows as an application to the Data Integration Service.
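The file-level checks mentioned above were written in PowerShell; as a rough illustration of the idea, here is a minimal Python sketch (the pipe delimiter and five-column layout are hypothetical, not from the actual project):

```python
import csv
from pathlib import Path

EXPECTED_COLUMNS = 5  # hypothetical layout, for illustration only

def file_level_checks(path: str) -> list:
    """Run basic file-level checks on a delimited flat file:
    existence, non-empty, and a consistent column count per row."""
    errors = []
    p = Path(path)
    if not p.exists():
        return [f"missing file: {path}"]
    if p.stat().st_size == 0:
        return [f"empty file: {path}"]
    with p.open(newline="") as f:
        for lineno, row in enumerate(csv.reader(f, delimiter="|"), start=1):
            if len(row) != EXPECTED_COLUMNS:
                errors.append(
                    f"line {lineno}: expected {EXPECTED_COLUMNS} fields, got {len(row)}"
                )
    return errors
```

A scheduler such as Control-M can run a script like this ahead of the ETL job and abort the run if the returned list is non-empty.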

Environment: Informatica Power Center 9.6.2 (Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer, Repository Manager, Informatica Data Quality (IDQ)), Oracle 11g, IBM DB2 V10.5, IBM Cognos

Informatica Developer

Confidential - Springfield, VA.

Roles & Responsibilities

  • Involved in the design and implementation of a data mart to support the reporting needs of business users.
  • Performed analysis of source systems, gathered business requirements, and translated functional requirements into technical specifications.
  • Created workflows and mappings using Informatica PowerCenter to extract data from ODS sources and load it into the target data mart.
  • Used Informatica features to implement Type 1 and Type 2 changes in slowly changing dimension tables and developed complex mappings to facilitate daily, weekly, and monthly loading of data.
  • Developed Informatica mappings for complex business requirements using transformations such as Normalizer, SQL Transformation, Expression, Aggregator, Joiner, Lookup, Sorter, Filter, and Router.
  • Developed mapplets and reusable transformations to promote reuse and reduce effort.
  • Created workflows to process and load XML files received from external sources.
  • Analyzed the data from underlying sources to determine cleansing requirements.
  • Understood dependencies and timing issues between source systems and data warehouse operations.
  • Used the Debugger to debug mappings and verify the data flow.
  • Created workflows with Command tasks, worklets, Decision, and Event Wait tasks, and monitored sessions using Workflow Monitor.
  • Created reusable failure and success email tasks.
  • Created and monitored sessions and batches using normal/bulk load options.
  • Created parameter files to change the load date and configured them at session level.
  • Involved in production support activities performing the debugging and troubleshooting incidents.
  • Created reconciliation reports to ensure data loaded into the data mart matches the source systems.
  • Involved in unit and system testing of ETL DW jobs to ensure standard Quality Assurance practices are followed.
  • Complied with standards and guidelines related to the design, construction, testing and deployment activities as established by departmental and organizational standards.
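The Type 1/Type 2 slowly-changing-dimension handling described above was implemented with Informatica features; as a minimal in-memory sketch of the underlying idea (column names, keys, and dates are illustrative, not from the project):

```python
from datetime import date

def apply_scd(dim_rows, incoming, type2_cols, today=None):
    """Apply slowly-changing-dimension logic to an in-memory dimension.

    dim_rows: list of dicts with a natural 'key', attribute columns,
    'eff_date', and 'end_date' (end_date of None marks the current row).
    A change in any column listed in type2_cols closes the current row
    and inserts a new version (Type 2); changes in other columns are
    overwritten in place (Type 1)."""
    today = today or date.today()
    current = {r["key"]: r for r in dim_rows if r["end_date"] is None}
    for rec in incoming:
        row = current.get(rec["key"])
        if row is None:
            # brand-new member: insert as the current version
            dim_rows.append({**rec, "eff_date": today, "end_date": None})
        elif any(row[c] != rec[c] for c in type2_cols):
            row["end_date"] = today  # close the old version (Type 2)
            dim_rows.append({**rec, "eff_date": today, "end_date": None})
        else:
            for c in rec:  # Type 1: overwrite attributes in place
                if c != "key":
                    row[c] = rec[c]
    return dim_rows
```

In PowerCenter the same decision is typically made with a Lookup on the dimension plus an Update Strategy transformation routing rows to insert or update.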

Environment: Informatica Power Center 9.5 (Workflow Manager, Workflow Monitor, Source Analyzer, Transformation Developer, Mapping Designer, Repository Manager), Oracle 11g, SQL Server

ETL Developer

Confidential - Vienna, VA.

Roles & Responsibilities

  • Responsible for requirements gathering for an enhancement requested by the client. Involved in analysis and implementation.
  • Wrote Teradata Macros and used various Teradata analytic functions.
  • Used SQL to query the databases, pushing as much processing as possible into Teradata, and applied query optimization techniques (explain plans, collect statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
  • Developed Teradata macros and stored procedures to load data into incremental/staging tables, then move data from staging to journal, and from journal into base tables.
  • Designed, developed, and built Informatica PowerCenter mappings and workflows using Teradata external loaders.
  • Extensively used Teradata utilities such as BTEQ, FastLoad, and MultiLoad, along with DDL and DML (SQL) commands.
  • Involved in the Data warehouse data modeling based on the client requirements.
  • Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the requirements.
  • Developed code to load the data from Flat File to stage and stage to DWH.
  • Developed and implemented a data warehousing project and provided production support for enhancements and maintenance.
  • Created primary indexes (PIs) both for planned access paths and for even distribution of data across all available AMPs.
  • Created appropriate Teradata NUPIs for fast and easy access to data.
  • Involved in performance tuning for sources, targets, mappings, and sessions.
  • Worked on Informatica PowerCenter tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Involved in full life cycle development including design, ETL strategy, troubleshooting, reporting, and identifying facts and dimensions.
  • Involved in creating views, dashboards and stories for reporting using Tableau desktop tool.
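Choosing a primary index for even distribution across AMPs, as described above, amounts to checking that the candidate PI column hashes rows into AMP buckets without skew. A rough, illustrative Python stand-in for that check (Teradata uses its own row-hash function, not Python's hash; the AMP count here is arbitrary):

```python
from collections import Counter

def bucket_skew(values, n_amps=8):
    """Estimate how evenly a candidate primary-index column would
    distribute rows across AMPs by hashing each value into n_amps
    buckets and comparing the fullest bucket to the ideal share.
    Returns 1.0 for a perfectly even spread; larger means more skew."""
    counts = Counter(hash(v) % n_amps for v in values)
    ideal = len(values) / n_amps
    return max(counts.values()) / ideal
```

A high-cardinality column (many distinct values) scores near 1.0, while a low-cardinality column piles rows onto a few AMPs and scores much higher, which is why unique or near-unique columns make better primary indexes.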

Environment: Teradata 13.x/12.x, Informatica Power Center 8.1, BTEQ, TPT, FastLoad, MultiLoad, SQL Server 2012, UNIX, Flat Files, Oracle 11g, Tableau Desktop 8.3.x.

ETL Developer

Confidential

Roles & Responsibilities

  • Developed the Informatica mappings for loading data into staging and the data warehouse.
  • Implement and maintain database code in the form of stored procedures, scripts, queries, views, triggers, etc.
  • Work closely with the CTO to implement effective and maintainable database coding practices that form an architectural foundation.
  • Work with front end developers to define simple yet powerful APIs.
  • Work with GIS analysts to implement geospatial processes, queries and reports.
  • Work with DBAs to ensure efficiency of database code, integrity of data structures and quality of data content.
  • Work with product managers to ensure database code meets requirements.
  • Work with DBAs and data analysts to ensure database code is accurately documented.
  • Took frequent backups of data, created new stored procedures, and scheduled backups.
  • Worked with Linux-based applications and troubleshot issues when they arose on the server.
  • Monitored servers.
  • Designed, developed, tested, and troubleshot software packages implementing complex ETL processes in accordance with business requirements and service level agreements.
  • Collaborated with application developers, business partners, management, and the data warehouse team.
  • Extended the data model to house additional structures as needed to meet applicable regulatory requirements.
  • Designed data warehouse and data mart fact tables, aggregate tables, and dimensions. Delivered quality data integration projects, standards, best practices, and a data dictionary.
  • Developed ETL audits and controls to ensure data quality meets or exceeds defined standards and thresholds. Developed and documented the systems, processes, and logic required to expose existing warehouse data sets to end users for reporting and analytical purposes.
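ETL audits and controls of the kind described above reduce to comparing load metrics against agreed thresholds after each run. A minimal Python sketch of that pattern (metric names and threshold values are hypothetical):

```python
def audit_load(metrics, thresholds):
    """Compare load metrics against minimum quality thresholds and
    return a list of failed checks (an empty list means the load
    passes). Both arguments are plain dicts, e.g.
    {"row_count": 990, "pct_valid_addr": 0.97}."""
    failures = []
    for name, minimum in thresholds.items():
        value = metrics.get(name)
        if value is None or value < minimum:
            failures.append(f"{name}: {value} below threshold {minimum}")
    return failures
```

A post-load task can call this with counts gathered from the warehouse and fail the job (or page support) when the returned list is non-empty.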

Environment: Informatica PowerCenter 8.5, Oracle 9i, Toad, PL/SQL, UNIX shell scripting
