
Sr. Informatica Developer Resume


Dallas, TX

SUMMARY

  • Over 14 years of IT experience, including around 8 years in ETL/Informatica data warehousing: designing, developing, maintaining and building large business applications such as data migration, integration, conversion, data warehousing and testing.
  • Expert in all phases of the Waterfall/Agile Scrum software development life cycle (SDLC) and in Inmon and Kimball DW and dimensional modeling methodologies (Snowflake and Star schemas): project analysis, requirements, design documentation (HLD, LLD), development, unit testing, user acceptance testing, implementation, post-implementation, production support and maintenance.
  • Expertise in data warehousing, ETL architecture, Data Profiling and Analysis.
  • Very strong experience in Informatica Power Center suite which includes Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Demonstrated experience with design and implementation of Informatica Data Quality (IDQ v9.5) applications across the full development life cycle.
  • Experience in all stages of ETL - requirement gathering, designing and developing various mappings, unit testing, integration testing and regression testing.
  • Expertise in creating Complex Informatica Mappings and reusable components like Reusable transformations, Mapplets, Worklets and reusable control tasks to work with reusable business logic.
  • Experience in PowerCenter and IDQ developing plans for analysis, standardization, match and merge, Address Doctor, and consolidation of data from different components.
  • Very strong knowledge of Informatica Data Quality transformations like Address validator, Parser, Labeler, Match, Exception, Association, Standardizer and other significant transformations.
  • Good skills in understanding and developing business rules for standardization, cleansing and validation of data in various formats.
  • Worked with B2B, MQ, TPT loading, file systems and databases as source systems and targets.
  • Experience in Teradata 14 and its utilities such as FastExport, FastLoad and BTEQ.
  • Good knowledge on US health care insurance system such as FACETS at DB level.
  • Expertise in tuning the performance of mappings and sessions in Informatica and determining the performance bottlenecks.
  • Experience in creating pre-session and post-session scripts to ensure timely, accurate processing and ensuring balancing of job runs.
  • Experience in integration of various data sources like SQL Server, Oracle, Teradata, flat files, DB2 Mainframes and XML.
  • Thorough knowledge of OLAP variants such as DOLAP, MOLAP, ROLAP and HOLAP.
  • In-depth knowledge of designing Fact and Dimension tables.
  • Experience in creating UNIX shell scripts, file transfers (FTP, SFTP), job scheduling and error handling. Knowledge of Unix commands for flat file manipulations with CLI and Vim Editor.
  • Knowledge in design and development of Business Intelligence reports using BI tools such as Tableau, SAP Business Objects(BO), IBM Cognos, Oracle OBIEE and MicroStrategy.
  • Thorough domain knowledge of business financial systems, banking, healthcare information technology, insurance and reinsurance, pharmacy claims systems and the telecom industry.
  • Implemented match and merge rules in Informatica MDM 10.1 to find duplicates and derive the golden record.
  • Have Knowledge on Informatica MDM concepts and implementation of De-duplication process and IDD.
  • Very strong conceptual and hands-on programming skills in Core Java: Collections, multi-threading, garbage collection, exception handling and object-oriented programming (OOP) concepts.
  • Worked proficiently in various IDEs including NetBeans and Eclipse.
  • Experience in designing, developing and deploying J2EE applications on IBM WebSphere, WebLogic, JBoss and Apache Tomcat application servers.

TECHNICAL SKILLS

DWH Tools: Informatica PowerCenter 10/9.6/9.5/9.1/9.0/8.6/8.5/8.1/7.1/6.2/5.1, Informatica Data Quality (IDQ) 9.5/10, Informatica PowerExchange

BI Tools: Tableau 10, SAP BO, IBM Cognos 8, MicroStrategy, Oracle OBIEE

Databases: Microsoft SQL Server 2012/2008R2/2005/2000, Oracle 10g/9i/8i/8/7.3, Sybase, Teradata 6/13.0/14.0, MySQL, DB2 11.0/10.0/9.0/8.0/7.0

Languages: Java SE, Java EE (J2EE), XML, SQL, T-SQL, PL/SQL, UNIX Shell Scripting

Operating System: Unix, Oracle/RHEL Linux and Microsoft Windows

Data Modeling Tools: Erwin 7.0

DB Tools: SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator, SQL Trace

Other Tools: Control-M and Autosys schedulers, Mercury/HP Quality Center, PuTTY, WinSCP

PROFESSIONAL EXPERIENCE

Confidential, Dallas,TX

Sr. Informatica Developer

Responsibilities:

  • Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build the DataMart.
  • Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.
  • Assisted in designing Logical/Physical Data Models, forward/reverse engineering using Erwin 7.0.
  • Developed mappings to extract data from SQL Server, Oracle, Teradata and flat files and load it into the DataMart using PowerCenter.
  • Developed common routine mappings. Made use of mapping variables, mapping parameters and variable functions.
  • Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to DataMart.
  • Developed Slowly Changing Dimension logic for Type 3 SCD (see the SQL sketch after this list).
  • Used mapplets in mappings, saving valuable design time and effort.
  • Used Informatica Workflow Manager to create, schedule, execute and monitor sessions, worklets and workflows.
  • Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
  • Used IDQ's standardized plans for address and name clean-ups. Used IDQ to complete initial data profiling and to remove duplicate data.
  • Used various transformations like Address validator, parser, joiner, filter, matching to develop the maps.
  • Involved in migration of the maps from IDQ to power center. Applied the rules and profiled the source and target table's data using IDQ.
  • Performed validation, standardization and cleansing of data while implementing the business rules.
  • Performed data profiling and scorecard preparation using Informatica Analyst.
  • Wrote procedures and queries to retrieve data from the DWH and implemented them in the data mart.
  • Wrote UNIX shell scripts to run the workflows and automated them through the Maestro job scheduler.
  • Wrote SQL queries, triggers and PL/SQL procedures to apply and maintain the business rules.
  • Troubleshot databases, workflows, mappings, sources and targets to identify bottlenecks and improve performance.
  • Created indexes and primary keys and performed other performance tuning at the database level.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, Workflows and database tuning.
  • Involved in generating reports from Data Marts using Cognos.
  • Defects were tracked, reviewed and analysed.
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Performed Configuration Management to Migrate Informatica mappings/sessions /workflows from Development to Test to production environment.
  • Successfully completed customer- and product-centric Master Data Management initiatives using Informatica MDM.
  • Defined and configured the schema: staging tables, landing tables, base objects, foreign-key relationships, lookup systems and tables, packages, query groups, and queries/custom queries.
  • Implemented Informatica MDM, including data profiling, configuration specification, coding, match-rule tuning and migration.
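
For illustration, a minimal SQL sketch of the Type 3 SCD logic referenced above: the dimension keeps the prior attribute value in its own column rather than adding a history row. The table and column names (dim_customer, stg_customer, region, prev_region) are hypothetical, and in the project this logic was built in PowerCenter mappings rather than as hand-written SQL.

    -- Type 3 SCD: overwrite the tracked value and preserve the old value in a companion column
    MERGE INTO dim_customer d
    USING stg_customer s
    ON (d.customer_id = s.customer_id)
    WHEN MATCHED THEN UPDATE SET
           d.prev_region      = d.region,        -- keep the prior value
           d.region           = s.region,        -- take the new value
           d.region_change_dt = CURRENT_DATE
         WHERE d.region <> s.region               -- only when the tracked attribute changed
    WHEN NOT MATCHED THEN INSERT (customer_id, region, prev_region, region_change_dt)
           VALUES (s.customer_id, s.region, NULL, CURRENT_DATE);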

Environment: Informatica PowerCenter 10/9.6, IDQ 10, MDM 10.x, MS SQL Server 2012/2008R2, Oracle 10g, MS Windows, Shell Scripts, Teradata 14.0, SQL, PL/SQL, Tableau 10, Erwin

Confidential

Sr. Informatica Developer

Responsibilities:

  • Designed mapping based on the Source-to-Target documents provided by the business team.
  • Attended business calls to understand requirements and huddles for status updates with the team.
  • Worked along with the UNIX team to write UNIX cron jobs and shell scripts to customize server scheduling jobs.
  • Designed S2T mapping documents, Unit test plans, punch list and code review check list.
  • Involved in complete development of Person domain, one of the major entities of organization.
  • Involved in HL7 and EDI message transmission into data warehouse using IBM WebSphere message queue.
  • Also involved in B2B data transformations using a parser and XSDs for source files.
  • Designed Mappings using B2B Data Transformation Studio.
  • Used B2B to convert structured and unstructured data to and from more broadly consumable data formats to support B2B and multi-enterprise transactions.
  • Worked on ETL development for the Dashboard application.
  • Generated FastLoad, FastExport and BTEQ scripts to load data into Teradata tables.
  • Developed various DQ rules and web services, and worked with data modelers and DBAs on physical and logical modeling.
  • Developed rules using IDQ to standardize SSN, email and phone number validations.
  • Used the Address Doctor transformation to standardize the address information.
  • Implemented business requirements using IDQ transformations such as Merge, Case Converter, Labeler, Parser, Standardizer and Address Doctor.
  • Developed mapplets, mappings, workflows, applications and parameter files in IDQ and PowerCenter.
  • Experience importing and exporting objects between IDQ and PowerCenter.
  • Exposure in deployment of objects across various environments (DEV, TEST, and PROD).
  • Generated ad-hoc queries in Teradata as per business requirements, and produced the required DDL and DML scripts for Teradata (see the SQL sketch after this list).
  • Experience working with a wide variety of sources and targets such as flat files, MQs and database objects.
  • Extensively used PowerCenter to design with various transformations such as MQ Series, XML, Expression, Joiner, Router, Filter, Update Strategy and Source Qualifier.
  • Extensively involved in performance tuning of the Informatica ETL mappings by using TPT connections, pushdown optimization, partitioning, resizing caches.
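
For illustration, a minimal sketch of the kind of Teradata DDL and DML generated for such requests; the database, table and column names are hypothetical.

    -- Hypothetical Teradata staging table
    CREATE MULTISET TABLE edw_stg.stg_claims
    (
        claim_id    INTEGER       NOT NULL,
        member_id   INTEGER,
        claim_amt   DECIMAL(12,2),
        service_dt  DATE
    )
    PRIMARY INDEX (claim_id);

    -- Move the staged rows into the target table
    INSERT INTO edw.fact_claims (claim_id, member_id, claim_amt, service_dt)
    SELECT claim_id, member_id, claim_amt, service_dt
    FROM   edw_stg.stg_claims;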

Environment: Informatica PowerCenter 9.1/9.0, B2B DX/DT, IDQ, Teradata, Teradata SQL Assistant, Teradata utilities (FastLoad, FastExport), SQL Server 2008/2012, UNIX Shell Scripts, Tableau.

Confidential, TX

Informatica Developer

Responsibilities:

  • Used Informatica PowerCenter for extraction, transformation and loading (ETL) of data.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets and parameter files in Mapping Designer.
  • Extensively used ETL processes to load data from various source systems such as Oracle, DB2, SQL Server, flat files and XML files into the target system, applying business logic in the transformation mappings to insert and update records during the load.
  • Created complex mappings to load the data and provide support. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator transformations.
  • Examined workflow log files and assigned tickets to Informatica support based on the error.
  • Performed operational support and maintenance of ETL bug fixes and defects.
  • Maintained the target database in the production and testing environments.
  • Supported migration of ETL code from development to QA and from QA to production.
  • Designed and developed Perl and UNIX shell scripts for FTP, sending files to the source directory and managing session files.
  • Performed extensive testing, wrote SQL queries and developed PL/SQL stored procedures to load data and implement business logic (see the sketch after this list).
  • Developed PL/SQL code at the database level for the new objects.
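
For illustration, a minimal PL/SQL sketch of the kind of load procedure described above; the procedure, table and column names, and the business rule itself, are hypothetical.

    -- Hypothetical procedure that applies a simple business rule while loading a data mart table
    CREATE OR REPLACE PROCEDURE load_daily_orders (p_load_dt IN DATE) AS
    BEGIN
        INSERT INTO dm_orders (order_id, customer_id, order_amt, load_dt)
        SELECT o.order_id,
               o.customer_id,
               o.order_amt,
               p_load_dt
        FROM   stg_orders o
        WHERE  o.order_amt > 0            -- business rule: skip zero/negative amounts
        AND    o.order_dt  = p_load_dt;   -- load only the requested day
        COMMIT;
    END load_daily_orders;
    /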

Environment: Informatica PowerCenter 9.1/9.0, SQL Server 2012/2008R2, Oracle 10g, Teradata 13.10, SQL, PL/SQL, UNIX Shell Scripts, OBIEE.

Confidential, KY

Informatica Developer

Responsibilities:

  • Developed ETL programs using Informatica to implement the business requirements.
  • Communicated with business customers to discuss the issues and requirements.
  • Created shell scripts to fine tune the ETL flow of the Informatica workflows.
  • Used Informatica file-watch events to poll the FTP sites for the external mainframe files.
  • Provided production support to resolve ongoing issues and troubleshoot problems.
  • Developed mappings to extract data from SQL Server and mainframes and load it into the data warehouse using PowerCenter.
  • Developed common routine mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.
  • Made use of mapping variables, mapping parameters and variable functions.
  • Developed Slowly Changing Dimensions for Type 1 and Type 2 SCDs (see the SQL sketch after this list).
  • Exported/imported data between different databases and flat files using DTS packages and BCP by defining source and target.
  • Wrote SQL queries, triggers, PL/SQL procedures, packages and UNIX shell scripts to apply and maintain the business rules.
  • Performed performance tuning and debugging at different levels, such as workflows, mappings and the database.
  • Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
  • Created testing metrics using MS-Excel
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
  • Performed Configuration Management to Migrate Informatica mappings/sessions /workflows from Development to Test to production environment.
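
For illustration, a minimal, generic SQL sketch (Oracle-style syntax) of the Type 2 SCD pattern referenced above: the current row is expired and a new version is inserted. The table and column names (dim_customer, stg_customer, region, current_flag) are hypothetical; in the project the logic was implemented in PowerCenter mappings.

    -- Step 1: expire the current row for customers whose tracked attribute changed
    UPDATE dim_customer d
    SET    d.current_flag     = 'N',
           d.effective_end_dt = CURRENT_DATE
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.region     <> d.region);

    -- Step 2: insert a new current row for new or changed customers
    INSERT INTO dim_customer
           (customer_id, region, current_flag, effective_start_dt, effective_end_dt)
    SELECT s.customer_id, s.region, 'Y', CURRENT_DATE, DATE '9999-12-31'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id  = s.customer_id
                       AND    d.current_flag = 'Y'
                       AND    d.region       = s.region);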

Environment: Informatica PowerCenter 9.0/8.6, SQL Server 2000, Mainframes, UNIX Shell Scripts, Business Objects, Cognos 8, Erwin, Autosys.

Confidential, Cambridge, MA

Informatica Developer

Responsibilities:

  • Designed the technical layout considering standardization, reusability, and scope for future improvement.
  • Documented the purpose of the Data Warehouse components (including transformations, mapplets, mappings, sessions, and batches) to help personnel understand the process and incorporate changes as and when necessary.
  • Developed complex mappings to extract source data from heterogeneous sources such as Teradata, SQL Server, Oracle and flat files, applied the proper transformation rules and loaded it into the Data Warehouse.
  • Involved in identifying bugs in existing mappings by analyzing data flow, evaluating transformations using Debugger.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
  • Worked closely with Production Control team to schedule shell scripts, Informatica workflows and pl/sql code in Autosys.
  • Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc. (see the SQL sketch after this list).
  • Defects were tracked, reviewed and analyzed.
  • Conducted UAT (User Acceptance Testing) with the user community.
  • Developed K-shell scripts to run from Informatica pre-session and post-session commands, and set up on-success and on-failure emails to send reports to the team.
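
For illustration, a minimal sketch of the kind of Oracle data-dictionary queries used for such database checks; the table name FACT_SALES is hypothetical.

    -- List the constraints defined on a target table
    SELECT constraint_name, constraint_type, status
    FROM   user_constraints
    WHERE  table_name = 'FACT_SALES';

    -- List the indexes on the same table and the columns they cover
    SELECT i.index_name, i.uniqueness, c.column_name, c.column_position
    FROM   user_indexes     i
    JOIN   user_ind_columns c ON c.index_name = i.index_name
    WHERE  i.table_name = 'FACT_SALES'
    ORDER  BY i.index_name, c.column_position;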

Environment: Informatica PowerCenter 8.6/8.1/7.1, Oracle 9i, SQL Server 2000, Erwin, XML, TOAD, HP-UX 11.11, Cognos

Confidential, Mt Laurel, NJ

JAVA Developer

Responsibilities:

  • Gathered specifications and participated in designing the system, building the database, development, testing and implementation.
  • Developed user interface using JSP, JSP Tag libraries and Spring Tag Libraries to simplify the complexities of the application.
  • Prepared technical specifications document for the given functional specifications
  • Developed JSP Custom tags to display Data and Graphs.
  • Used Enterprise Java Beans to ease the implementation and development of application components.
  • Developed Java Beans to use in JSP's.
  • Created Stateless Session Beans for retrieving data and Entity Beans for maintaining User Profile and developed session beans to maintain authentication user roles.
  • Used JDBC connection pooling for accessing embedded and legacy data sources.
  • Developed front-end user interface screens and server-side scripts using JSP, HTML, JavaScript, Servlets, Custom Tags and XML.
  • Used XML Spy for creating and validating XML files and for generating XSL style sheets.
  • Designed and Implemented Server Objects using Java Servlets, EJB, JDBC.

Environment: Java, J2EE, Servlets, JSP, JavaScript, Spring, Java Beans, Hibernate, MVC, Tomcat, JBuilder, XML, MS SQL Server, JDBC, EJB.
