
Sr. Informatica Administrator Resume


Racine, WI

PROFESSIONAL SUMMARY:

  • Over 5 years of specialized IT experience in building Enterprise Data Warehouses, Data Marts, and Operational Data Stores in major industry sectors such as Finance, Healthcare, Insurance, and Retail using Informatica Power Center 9.6.1/9.5.1/8.6.1/7.1.1.
  • Expertise in working with relational databases such as Oracle 11g/10g/9i, Teradata 14/12, SQL Server 2000/2005/2008 and DB2.
  • Proficient in entire life cycle of DWH with project scope, Analysis, gathering requirements, data modeling, ETL Design, development, Unit / System testing, Production deployment and production support.
  • Strong experience in dimensional modeling, including star and snowflake schemas and the identification of facts and dimensions.
  • Strong experience in Enterprise Data Warehouse environments, using Informatica to extract/load data from/to Teradata.
  • Expertise in using Teradata database and utilities like TPT, FASTLOAD, MULTILOAD, and BTEQ scripts.
  • Experience in Teradata administrative activities and query tuning.
  • Extensive experience in developing Stored Procedures, Packages, Functions, Views and Triggers, Complex SQL queries and Oracle PL/SQL.
  • Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions.
  • Experience in managing the non-production data in lower environments using Test Data Management.
  • Developed mappings in Informatica to load data from various sources into the Data Warehouse, using different transformations such as Source Qualifier, Java, Expression, Lookup, Aggregator, Update Strategy, and Joiner.
  • Experience in designing Profiles and scorecards using Developer (MRS) and Analyst tool.
  • Experience in building mapplets and physical and logical data objects for data quality objects.
  • Good Knowledge in Informatica MDM implementation and Informatica DT Studio implementation.
  • Extensive experience in writing UNIX Shell scripts and automation of the ETL processes using UNIX shell scripting.
  • Acquired experience with all aspects of the SDLC (Software Development Life Cycle): Planning, Analysis, Design, Deployment, Testing, Integration, and Support.
  • Expertise with slowly changing dimensions and change data capture of the Data Warehouse Environment.
  • Experience in using Automation Scheduling tools like Autosys, TWS and Control-M.
  • Experience in using BI and reporting tools such as SSRS, SSIS, Tableau, SharePoint, and arcplan.
  • Expertise in Logical and Physical design of databases for OLTP and Data Warehouse Environments.
  • Experience in planning production code deployments and implementing them without affecting other scheduled jobs in the data warehouse environment.
  • Good communication skills with strong ability to interact with customers, end-users and team members.
  • Solid analytical and troubleshooting skills; self-motivated and enjoy working in a technically challenging environment.
  • Provided 24x7 on-call support while working on production environments.
  • Provided L3 and L2 technical support during production code deployments.
  • Experience in resolving technical and analytical issues within the SLAs defined in ServiceNow.
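The shell-based automation of ETL processes mentioned above is typically built around Informatica's pmcmd command-line tool. The sketch below is a minimal, hypothetical wrapper: the service, domain, folder, and workflow names are placeholders, and it only echoes the command it would run (dry run) so it can be exercised without an Informatica installation.

```shell
#!/bin/sh
# Minimal sketch of a workflow-launch wrapper around Informatica's pmcmd CLI.
# All names below are hypothetical placeholders, not from any actual project.
INT_SERVICE="int_svc_dev"   # assumed Integration Service name
INFA_DOMAIN="dom_dev"       # assumed domain name
FOLDER="DW_LOADS"           # assumed repository folder
WORKFLOW="wf_daily_load"    # assumed workflow name
DRY_RUN=1                   # set to 0 on a host where pmcmd is available

# -uv/-pv read the user name and password from environment variables,
# keeping credentials out of the script itself.
CMD="pmcmd startworkflow -sv $INT_SERVICE -d $INFA_DOMAIN -uv PM_USER -pv PM_PASS -f $FOLDER -wait $WORKFLOW"

if [ "$DRY_RUN" -eq 1 ]; then
    echo "Would run: $CMD"
else
    $CMD || { echo "Workflow $WORKFLOW failed" >&2; exit 1; }
fi
```

A scheduler such as Autosys, TWS, or Control-M would invoke a wrapper like this and key off its exit status.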

TECHNICAL SKILLS:

Operating systems: Windows, UNIX and LINUX

Databases: Oracle 12c/11g/10g/9i/8i, Teradata 14.10/14/13/12, DB2, MS SQL Server 2000/2005/2008, Sybase 12.0, Hadoop (Hortonworks 2.0/2.2)

Modeling: Dimensional Data Modeling, Star and Snowflake Schema Modeling, Erwin, Microsoft Visio

ETL Tools: Informatica Power Center 9.6.1/9.5.1/8.6.1/7.1.1 and Informatica BDE 9.6.1

Reporting Tools: Tableau, SharePoint, SSRS, arcplan, SSIS

Languages: Java, XML, UNIX Shell Scripting, SQL, PL/SQL

PROFESSIONAL EXPERIENCE:

Confidential, Racine, WI

Sr. Informatica Administrator

Responsibilities:

  • Created the technical documentation of the architecture of the data warehouse, including the physical components and their functionality following the client standards.
  • Performed troubleshooting of technical issues and incidents related to the Informatica, Teradata, and Hadoop applications.
  • Performed the query tuning and the code performance tuning during the testing phase of the project.
  • Performed tests to evaluate the hardware and software needed to support the data warehouse, including Informatica BDE, HDFS, Teradata, data integration, BI, and analytics.
  • Analyzed the existing logic of Informatica & Teradata to understand the data loading patterns and designed the process to load the missing data.
  • Performed Teradata Query tuning and utilized Primary and secondary indexes to improve the performance.
  • Performed Teradata administrative activities such as user creation, database creation, memory allocation, Table and view creation using Teradata administrator.
  • Used Teradata View Point for monitoring the query performance of the running sessions.
  • Developed test cases for the UAT/QA testing before deploying the code to production environment.
  • Performed Informatica server maintenance by checking the space/disk usage, CPU usage and the node availability.
  • Performed data masking for loading non-confidential data into lower environments.
  • Developed Informatica code to extract the data from SAP and load into Hadoop database at HDFS and hive level.
  • Worked with Hadoop MapReduce logs to identify issues with the Informatica code extracting/loading data to/from Hadoop.
  • Created Tables and databases in Hadoop as required for the staging process.
  • Maintained the user and system security of the Informatica, Teradata, Hadoop and Tableau applications.
  • Designed specifications for client machines, application servers, database servers, and networks.
  • Participated in software upgrades and patches including coordination, testing and stabilization.
  • Performed the upgrade of Informatica BDE from 9.6.1 to 9.6.1 Hot Fix 3, including the Hadoop upgrade from Hortonworks 2.0 to 2.2.
  • Interacted with vendor support to address technology and software issues related to Informatica, Hadoop and Tableau.
  • Performed three emergency break fixes on Informatica and one on Hadoop to address issues with the versions in use.
  • Developed a plan for upgrading the applications to the latest stable versions: Informatica to 10.1, Hortonworks to 2.8, and Tableau to 10.
  • Implemented Kerberos on Informatica and Hadoop applications.
  • Developed a plan for the production code deployments for different projects and implemented them in a timely manner without affecting the current schedule.
  • Extensively worked with different transformations such as Expression, Aggregator, Sorter, Joiner, Router, Lookup, Filter and Union in resolving the issues.
  • Provided support for the Informatica Data Quality profiles and scorecards that perform data quality analysis of data from the SAP, DB2, and Hadoop systems.
  • Extensively worked with session logs, profile logs, and workflow logs for error handling and troubleshooting.
  • Performed troubleshooting and performance tuning of the IDQ code using the Informatica Analyst tool.
  • Performed Informatica administrative activities like scheduling the repository backups, creating the new users/folders/projects/connections, maintaining security, LDAP user maintenance, object migration, etc.
  • Involved in doing Unit Testing, Integration Testing and Data Validation.
  • Worked with the TWS (Tivoli Workload Scheduler) and Autosys scheduling team in scheduling Informatica Jobs as per the requirement and frequency.
  • Implemented various performance tuning techniques by finding bottlenecks at the source, target, mapping, and session levels and optimizing them using the Debugger Wizard in the Power Center tool.
  • Performed the system maintenance like scheduling the repository and domain backups, log cleanup using the shell scripts and weekly system checks for the capacity planning.
  • Created UNIX shell scripts and parameter files to reuse the existing code.
  • Worked with the Tableau team in creating dashboards, generating various reports, and maintaining the confidentiality of the different reports.
  • Involved in preparing the Migration Documents and standards across all the platforms.
  • Provided L3 on-call support for engineering-related issues across the Information Management stack.
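The log-cleanup housekeeping described above can be sketched as a simple retention script. The example below is self-contained: it works in a throwaway temp directory rather than a real Informatica shared log path, and the 14-day retention window is an assumed value.

```shell
#!/bin/sh
# Sketch of session/workflow log cleanup with a retention window.
# Uses a throwaway directory so the example is self-contained; a real
# script would point at the Informatica shared log directory instead.
LOG_DIR=$(mktemp -d)
RETENTION_DAYS=14   # assumed retention window

touch "$LOG_DIR/wf_daily_load.log"              # recent log: should survive
touch -t 202001010000 "$LOG_DIR/wf_old_run.log" # old log: should be purged

# Remove logs older than the retention window.
find "$LOG_DIR" -name '*.log' -mtime +"$RETENTION_DAYS" -exec rm -f {} +

REMAINING=$(find "$LOG_DIR" -name '*.log' | wc -l | tr -d ' ')
echo "Logs remaining after cleanup: $REMAINING"
rm -rf "$LOG_DIR"
```

A weekly cron entry running a script like this keeps disk usage inside the capacity plan.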

Environment: Informatica Power Center 9.6.1/9.6.1 HF3, Informatica Developer 9.6.1 HF3, Informatica Analyst, Teradata 14, Teradata DBA tools, DB2, SAP, XML, Flat Files, SQL Assistant, UNIX shell scripting, Hortonworks 2.2/2.0, Tableau 9.3/8.3, SSRS, OLAP, arcplan 8.

Confidential, Dallas, TX

Sr. Informatica Administrator

Responsibilities:

  • Developed internal and external Interfaces to send the data in regular intervals to Data warehouse systems.
  • Extensively used Power Center to design multiple mappings with embedded business logic.
  • Involved in discussion of user and business requirements with business team.
  • Performed data migration in different sites on regular basis.
  • Involved in upgrade of Informatica from 9.1 to 9.5.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, and Router transformations to populate target tables in an efficient manner.
  • Attended the meetings with business integrators to discuss in-depth analysis of design level issues.
  • Involved in data design and modeling by specifying the physical infrastructure, system study, design, and development.
  • Extensively involved in performance tuning of the Informatica ETL mappings by using caches, overriding SQL queries, and using parameter files.
  • Developed complex SQL queries for interfaces that extract data at regular intervals to meet business requirements, and extensively used Teradata utilities such as MultiLoad, FastLoad, TPT, BTEQ, and FastExport.
  • Analyzed session log files in session failures to resolve errors in mapping or session configuration.
  • Wrote various UNIX shell scripts for scheduling data cleansing scripts, loading processes, and automating the execution of mappings.
  • Created transformations like Expression, Lookup, Joiner, Rank, Update Strategy and Source Qualifier Transformation using the Informatica designer.
  • Created mapplets and used them in different mappings.
  • Performed tuning of the Teradata BTEQ scripts.
  • Involved in Teradata upgrade from TD12 to TD14 for analysis, testing and document preparation.
  • Worked with Flat Files, XML, DB2, and Oracle as sources.
  • Worked on Informatica B2B data transformation using Informatica Data Transformation Studio.
  • Wrote PL/SQL procedures and functions and was involved in the change data capture (CDC) ETL process.
  • Implemented Slowly Changing Dimension Type II for different Dimensions.
  • Involved in the Informatica, Teradata, and Oracle upgrade processes and tested the environments during the upgrades.
  • Worked with Informatica version control extensively.
  • Wrote unit test scripts to test the developed interfaces.
  • Managed enhancements and coordinated Informatica objects with every release.
  • Provided support for the production department in handling the data warehouse.
  • Worked under Agile methodology and used VersionOne to track tasks.
  • Wrote thorough design documents, unit test documentation, and installation and configuration guides.
  • Performed bulk data imports and created stored procedures, functions, views and queries.
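The Slowly Changing Dimension Type II loads implemented above generally expire the current row and insert a new version. The script below simply prints a hedged SQL sketch of that pattern; the dim_customer/stg_customer tables and their columns are hypothetical, not taken from any actual schema.

```shell
#!/bin/sh
# Prints an SCD Type 2 SQL sketch; all table and column names are
# hypothetical illustrations of the expire-and-insert pattern.
SQL_TEXT=$(cat <<'SQL'
-- Expire the current version of rows whose tracked attribute changed.
UPDATE dim_customer d
SET    eff_end_dt = CURRENT_DATE, curr_flag = 'N'
WHERE  d.curr_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE  s.cust_id = d.cust_id
               AND    s.cust_addr <> d.cust_addr);

-- Insert a new current version for changed and brand-new customers
-- (neither has an open 'Y' row after the update above).
INSERT INTO dim_customer (cust_id, cust_addr, eff_start_dt, eff_end_dt, curr_flag)
SELECT s.cust_id, s.cust_addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE  d.cust_id = s.cust_id AND d.curr_flag = 'Y');
SQL
)
echo "$SQL_TEXT"
```

In Informatica, the same pattern is driven by a Lookup on the current dimension row plus an Update Strategy transformation.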

Environment: Informatica Power Center 9.6.1/9.5.1, Informatica DT Studio, Teradata 14/12, Oracle 10g, DB2, XML, Flat Files, SQL Assistant, Toad, PL/SQL, UNIX shell scripting, Cognos, SVN, Windows 7.

Confidential, DALLAS, TEXAS

Sr. ETL Informatica Developer/ Administrator

Responsibilities:

  • Maintained Data Flow Diagrams (DFDs) and ETL technical specs or lower-level design documents for all the source applications.
  • Involved in requirement analysis in support of Data Warehousing efforts.
  • Extensively worked with various Active transformations like Filter, Sorter, Aggregator, Router and Joiner transformations.
  • Extensively worked with various Passive transformations like Expression, Lookup, Sequence Generator, Mapplet Input and Mapplet Output transformations.
  • Worked with source databases like Oracle, SQL Server and Flat Files.
  • Extensively worked with Teradata utilities BTEQ, FastLoad, MultiLoad, and TPT to load data into the warehouse.
  • Responsible for definition, development and testing of processes/programs necessary to extract data from client's operational databases, transform, cleanse data and load it into data marts.
  • Created complex mappings using Unconnected and Connected Lookup Transformations.
  • Responsible for the performance tuning of the ETL process at source level, target level, mapping level and session level.
  • Implemented Slowly changing dimension Type 1 and Type 2 for change data capture.
  • Worked with various lookup caches such as Dynamic, Static, Persistent, Recache from Database, and Shared.
  • Worked extensively with update strategy transformation for implementing inserts and updates.
  • Worked with various Informatica Power Center objects like Mappings, transformations, Mapplet, Workflows and Session Tasks.
  • Auditing is captured in the audit table, and an EOD snapshot of daily entries is sent to the distribution list to analyze any abnormalities.
  • Per business requirements, implemented auditing and balancing on the transactional sources so that every record read is either captured in the maintenance tables or written to the target tables.
  • Extensively used the tasks like email task to deliver the generated reports to the mailboxes and command tasks to write post session and pre-session commands.
  • Extensively used debugger to test the logic implemented in the mappings.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size and target based commit interval.
  • Monitored workflows and session using Power Center workflow monitor.
  • Provided 24x7 support for the production environment covering extraction, loading, and transformation of data through remote monitoring.
  • Monitored the extraction and loading of data and was involved in writing UNIX shell scripts to automate jobs.

Environment: Informatica Power Center 9.1.1, Oracle, SQL Server, MySQL, Flat Files, Teradata 12, Cognos, UNIX AIX, and Windows XP.

Confidential, CA

Informatica Developer

Responsibilities:

  • Utilized the features of the Source Qualifier transformation, such as filtering, joining, sorting, and SQL override, to the fullest extent at the source level.
  • Worked with Business Data Analysts (BDA) to understand the requirements for MDW development.
  • Extensively worked with different transformations such as Expression, Aggregator, Sorter, Joiner, Router, Lookup, Filter and Union in developing the mappings to migrate the data from source to target.
  • Extensively used the Lookup transformation and lookup caches to look up data from relational tables and flat files.
  • Extracted data from various heterogeneous sources such as Sybase, Flat Files, and COBOL (VSAM) using Informatica Power Center and loaded the data into the target DB2 database.
  • Extensively worked with session logs and workflow logs for error handling and troubleshooting.
  • Involved in doing Unit Testing, Integration Testing and Data Validation.
  • Worked with the Control-M scheduling team in scheduling Informatica jobs per the requirements and frequency.
  • Implemented various performance tuning techniques by finding bottlenecks at the source, target, mapping, and session levels and optimizing them.
  • Used FACETS in extracting the sampling data.
  • Involved in DWH upgrades for source system changes.
  • Created mapping parameters and variables and wrote parameter files.
  • Created UNIX shell scripts for various needs.
  • Worked with the Debugger Wizard in debugging the mappings.
  • Used the Normalizer transformation for COBOL (VSAM) sources.
  • Worked with External stored procedures for data cleansing purpose.
  • Worked with the Cognos team in generating various reports.
  • Worked with the HEDIS team for reporting.
  • Implemented Informatica Procedures and Standards while developing and testing the Informatica objects.
  • Successfully provided 24x7 on-call support for production databases.
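The parameter files mentioned above follow Power Center's sectioned key/value format. The fragment below is a hypothetical sketch: the folder, workflow, session, connection, and parameter names are all placeholders, not from any actual project.

```ini
; Hypothetical Power Center parameter file (all names are placeholders).
[DW_LOADS.WF:wf_daily_load.ST:s_m_load_sales]
; Session parameters: source/target connections resolved at run time.
$DBConnection_Src=ORA_SRC_DEV
$DBConnection_Tgt=DB2_TGT_DEV
; Mapping parameters, referenced as $$LOAD_DATE / $$REGION_CD in the mapping.
$$LOAD_DATE=01/01/2016
$$REGION_CD=WEST
```

Keeping values like the load date in a parameter file lets the same mapping be promoted across environments without code changes.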

Environment: Informatica Power Center 8.6.1, Sybase, DB2, Flat Files, COBOL (VSAM), WinSQL, FACETS, HEDIS, Toad, UltraEdit-32, SQL Advantage, Control Center, PowerDesigner SQL Modeler, CDMA, MS Access, MS Visio, Cognos, UNIX, Windows XP Professional.

Confidential

Jr. ETL Developer

Responsibilities:

  • Maintained user roles and privileges at the database level, which involved enrolling users, maintaining system security, and controlling and monitoring user access.
  • Involved in designing and developing the database for the business division of Creative Eye at Focus for effectively managing manufacturing processes.
  • Designed and developed database views and stored procedures, Packages and Functions.
  • Used SQL to extract data from the database, including through views on the tables.
  • Worked with Development Life Cycle teams in performing the Data Movement.
  • Wrote stored procedures, functions, and packages and used them in many Forms and Reports.
  • Wrote database triggers to automatically update tables and views.
  • Designed and developed forms and reports.
  • Involved in the creation of jobs using Informatica workflow manager to validate schedule run and monitor jobs using Workflow Monitor.
  • Involved in the preparation of documentation for ETL using Informatica standards, procedures and naming conventions.
  • Evaluated various PL/SQL features such as dynamic SQL and parameter passing of PL/SQL tables.
  • Developed SQL Applications for extracting the data from the Oracle tables.
  • Created detailed system design specifications to serve as a guide for system and program development.
  • Wrote triggers and packages by using PL/SQL for giving security privileges.
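A security/audit trigger of the kind written above can be sketched in PL/SQL. The script below only prints the sketch so it can run anywhere; the trigger, table, and column names (trg_emp_audit, emp, emp_audit) are hypothetical.

```shell
#!/bin/sh
# Prints a hedged PL/SQL sketch of an audit trigger; trg_emp_audit, emp,
# and emp_audit are hypothetical names, not from any actual schema.
TRG_TEXT=$(cat <<'PLSQL'
CREATE OR REPLACE TRIGGER trg_emp_audit
AFTER UPDATE ON emp
FOR EACH ROW
BEGIN
  -- Record who changed the row and when, for security auditing.
  INSERT INTO emp_audit (emp_id, changed_by, changed_at)
  VALUES (:OLD.emp_id, USER, SYSDATE);
END;
PLSQL
)
echo "$TRG_TEXT"
```

A row-level AFTER UPDATE trigger like this fires once per modified row, which is what makes it usable for per-record auditing.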

Environment: Informatica Power Center 7.1.1, Oracle, Reports 2.5, SQL, PL/SQL, Windows NT, Erwin, UNIX
