Informatica Admin Resume
New York, NY
PROFESSIONAL SUMMARY:
- Software IT Professional with 7 years of diversified experience in Informatica PowerCenter, Data Quality, B2B, Master Data Management, Talend Data Integration / Data Quality / MDM, Ab Initio, Oracle PL/SQL, Teradata SQL, Netezza SQL and UNIX Shell Scripting.
- Extensive experience working with the clients Confidential, Confidential, Confidential, Confidential and Confidential in the BFSI (Banking, Financial Services and Insurance), Mortgage, Retail and Healthcare business domains.
- Expertise in handling various aspects of Business Intelligence viz. Data Warehouse Design, Development, Administration and Architecture, ETL Administration, Development and Production Support.
- Worked extensively on building Dimensions, Bridges, Facts, Star Schemas, Snowflake (Extended Star) Schemas and Galaxy Schemas.
- Expertise in database normalization techniques; developed Data Marts and Data Warehouses using Dimensional Modeling based on Ralph Kimball's design methodology, the Relational On-Line Analytical Processing (ROLAP) approach and the Teradata Financial Services Logical Data Model (FSLDM).
- Comprehensive experience working with Type 1, Type 2 and Type 3 methodologies for Slowly Changing Dimension (SCD) management.
- Good understanding of the Teradata and Netezza parallel architecture models.
- Good understanding of end-to-end server architecture and how ETL / Big Data tools (Informatica, Talend, Ab Initio, Apache Big Data products) interact with it.
- Extensive experience in writing UNIX automation scripts using Informatica Command Line Interface (CLI) commands such as pmcmd, pmrep, infacmd.sh and infasetup.sh, and server monitoring commands such as iostat, vmstat, mpstat and top (a minimal example is sketched after this list).
- Extensive experience working with Solaris, SUSE Linux and AIX servers and clustered file systems such as VxCFS and GPFS, with a good understanding of their architecture.
- Good understanding of data storage and networking.
- Extensive experience in capacity planning and benchmarking of server memory, CPU, storage and network.
- Expertise in handling multiple projects simultaneously and in the design and development of client-server and multi-tier applications, Decision Support Systems and solutions.
- Expertise in administering 11 different application servers at the same time.
- Expertise in server migration projects and end-to-end setup of Informatica and Talend application servers.
- Troubleshooting and performance tuning of Informatica servers and ETL applications.
- Experience in Informatica version upgrade projects (8.6.1 to 9.1.0, 9.1.0 to 9.5.1 HF2, and 9.5.1 HF2 to 9.6.1 HF2) and in setting up Informatica MDM Multidomain and Informatica Data Director (IDD).
- Experience in creating Architecture Design Documents (ADDs).
- Experience in establishing best practices and standards for ETL process documents.
- Terrific individual performer and an excellent team player.
- Quick to pick up new technologies and excel in a dynamic environment.
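A minimal sketch of the kind of UNIX automation script referenced above, assuming placeholder domain, Integration Service, folder and workflow names; it starts a PowerCenter workflow with pmcmd and captures a quick server-health snapshot with vmstat and iostat.

```sh
#!/bin/sh
# Sketch only: service, domain, folder and workflow names are placeholders.
INT_SERVICE="IS_DEV"
DOMAIN="Domain_DEV"
FOLDER="SALES_DM"
WORKFLOW="wf_load_sales_daily"

# Start the workflow and wait for completion; credentials are read from
# environment variables (-uv / -pv) rather than hard-coded in the script.
pmcmd startworkflow -sv "$INT_SERVICE" -d "$DOMAIN" \
     -uv INFA_USER -pv INFA_PWD \
     -f "$FOLDER" -wait "$WORKFLOW"
RC=$?

# Capture a quick snapshot of CPU, memory and I/O alongside the run.
vmstat 5 3    > /tmp/vmstat_$(date +%Y%m%d_%H%M).log
iostat -x 5 3 > /tmp/iostat_$(date +%Y%m%d_%H%M).log

if [ "$RC" -ne 0 ]; then
    echo "Workflow $WORKFLOW failed with return code $RC" >&2
    exit "$RC"
fi
echo "Workflow $WORKFLOW completed successfully"
```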
SKILL SET:
Business Areas: BFSI (Banking, Financial Services and Insurance), Mortgage, Retail, Pharmacy
Project Development Methodologies: Agile, Agile Scrum, Kanban, Waterfall
ETL / MDM Tools: Informatica on Grid, Informatica PowerCenter 10.1.0/9.6.1/9.5.1/9.1.0/8.6.1/7.1.3, Informatica BDM 10.1/9.6.1, Informatica Developer (Data Quality), Informatica B2B, Informatica Master Data Management (MDM) Multidomain, Informatica Metadata Manager, Informatica Test Data Management (TDM), Informatica PMPC, Informatica Data Replication (IDR), Informatica Data Validation Option (DVO), Informatica Data Director (IDD), Teradata Decision Expert 12.1.2, Teradata BTEQ, FLOAD, MLOAD, NZSQL, Talend, Ab Initio
Application Servers: JBoss, Apache Tomcat
Modeling Tools: ERwin 7.3.1, ER/Studio 8.0
Databases and DB Tools: Oracle 11g/10g/9i/8i, Teradata 12.0/13.0/14.0, Netezza, DB2, CA-IDMS, PL/SQL Developer, Toad, SQL*Plus, Teradata SQL Assistant, Aginity
Programming Languages: Python, Java, Scala, R, UNIX Shell Scripting, Windows Batch Scripting, VB Scripting, Oracle PL/SQL, Teradata SQL, Netezza SQL, NoSQL, COBOL, Easytrieve, JCL
Scheduling/Automation Tools: AppWorx 6.1.4, UC4 Applications Manager 8.0, Control-M 7.0, Autosys, CA7 Job Scheduler
Operating Systems: UNIX (Solaris), SUSE Linux, AIX, Windows XP, MVS for IBM 3090 Mainframe
Cloud Computing Tools: AWS (Amazon Web Services), IBM SoftLayer
Reporting Tools: BOXI (Business Objects XI), PC SAS, SAS Enterprise Guide, Teradata DecisionCast
Mainframe Tools: OLQ, DMLO, IDDM, ADSC, MAPC, ADSA, ADSL, DME, ADSALIVE, APC, XPEDITER for IBM 3090 Mainframe, Darstran.
Apache Framework Tools: Hadoop ecosystem viz. Hadoop Common, Hadoop Distributed File System (HDFS), Hadoop YARN, Hadoop MapReduce, Spark, Sqoop, Flume, HBase, Hive, Pig, HCatalog
PROFESSIONAL EXPERIENCE:
Confidential, New York, NY
Informatica Admin
Responsibilities:
- Installed and configured PowerCenter 9.6.1 HF3 and HF4 on the Linux platform.
- Upgraded and maintained five environments: DEV, SIT, UAT, LTA and PROD.
- Performed Informatica administrative activities such as creating folders and users, deploying code between environments, and maintaining the DEV, QA and PROD environments.
- Managed day-to-day Informatica administration support covering the domain, nodes, Repository Services, Integration Services, domain security and privileges.
- Installed patches and hotfixes and kept the tool up to date on a regular basis.
- Coordinated with Informatica Global Support, the DBA team and the Network team on job failures, performance bottlenecks and software bugs for critical P1/P2 incidents on Level 3 support.
- Involved in IDS code deployments from the PowerCenter Developer tool to SQL Data Services and in troubleshooting the resulting issues.
- Experienced in deploying IDS code to web services and fixing related issues.
- Worked on the IDS tool for data virtualization when replicating data from one database to another.
- Involved in Disaster Recovery (DR) activities.
- Extensive knowledge of configuring SFTP connections with key-based authentication (see the sketch after this list).
- Good knowledge of creating HDFS connections to Hadoop cluster servers with Kerberos authentication.
- Worked on the SQDATA tool, which processes CDC and VSAM files from mainframe DB2 and IMS databases.
- Installed SQDATA patches and handled admin activities such as starting and stopping the Apply engines and clearing utility logs.
- Involved in troubleshooting database-related issues and worked on performance improvements to reduce latency.
- Involved in patching the SQDATA engines, parsing the data and starting the Apply engines.
- Created and maintained Analyst Services, handling all admin-related activities such as connection creation and user permissions.
- Installed JDBC-ODBC and Netezza-ODBC drivers.
- Set up the Data Analyzer tool to fetch session statistics from the repository metadata database.
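A brief sketch of the connectivity checks behind the SFTP and Kerberos/HDFS items above; the host names, Kerberos principal, keytab path and HDFS directory are all placeholders.

```sh
#!/bin/sh
# Sketch only: hosts, principal, keytab and paths below are placeholders.

# 1. SFTP with key-based authentication: generate a key pair once,
#    publish the public key to the partner host, then test the login.
ssh-keygen -t rsa -b 2048 -f "$HOME/.ssh/id_rsa_etl" -N ""
ssh-copy-id -i "$HOME/.ssh/id_rsa_etl.pub" etl_user@partner-sftp.example.com
echo "ls" | sftp -i "$HOME/.ssh/id_rsa_etl" etl_user@partner-sftp.example.com

# 2. HDFS over Kerberos: obtain a ticket from the keytab, then confirm
#    the Hadoop cluster is reachable from the Informatica server.
kinit -kt /etc/security/keytabs/etl.keytab etl_user@EXAMPLE.COM
hdfs dfs -ls /landing/inbound
```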
Confidential, Detroit, MI
Informatica Developer/Administrator
Responsibilities:
- Installed and configured Informatica 9.5.1, 9.6.1 and 10.1.
- Upgraded Informatica from 9.5.1 to 9.6.1 and from 9.6.1 to 10.1.
- Created folders and granted user access to the Informatica repository.
- Applied Informatica patches to the Informatica servers.
- Developed common routine mappings. Made use of mapping variables, mapping parameters and variable functions.
- Used Informatica Designer to create complex mappings using transformations such as Filter, Router, Connected and Unconnected Lookup, Stored Procedure, Joiner, Update Strategy, Union, Expression and Aggregator.
- Developed Type 2 Slowly Changing Dimension (SCD) mappings.
- Used mapplets within mappings, saving valuable design time and effort.
- Used Informatica Workflow Manager to create, schedule, execute and monitor sessions, worklets and workflows.
- Wrote procedures and queries to retrieve data from EIDW and implemented them in NIDW.
- Worked with session logs and workflow logs for error handling and troubleshooting in all environments.
- Responsible for Unit testing and Integration testing of mappings and workflows.
- Created jobs and job streams in the Control-M scheduling tool to schedule Informatica, SQL script and shell script jobs (see the wrapper script sketch after this list).
- Used the Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map the sources to the targets.
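A short sketch of a wrapper script of the kind a Control-M job could invoke for the Informatica jobs mentioned above; Control-M treats a non-zero exit code as a failure, so the script simply propagates pmcmd's return code. The service, domain and folder names are placeholders and the workflow name is passed in as a job parameter.

```sh
#!/bin/sh
# Sketch only: run a PowerCenter workflow from a scheduler and return its status.
INT_SERVICE="IS_PROD"
DOMAIN="Domain_PROD"
FOLDER="EIDW"
WORKFLOW="$1"            # workflow name supplied by the Control-M job definition

if [ -z "$WORKFLOW" ]; then
    echo "Usage: $0 <workflow_name>" >&2
    exit 1
fi

pmcmd startworkflow -sv "$INT_SERVICE" -d "$DOMAIN" \
     -uv INFA_USER -pv INFA_PWD \
     -f "$FOLDER" -wait "$WORKFLOW"
RC=$?

# On failure, pull the workflow details so the scheduler log shows the reason.
if [ "$RC" -ne 0 ]; then
    pmcmd getworkflowdetails -sv "$INT_SERVICE" -d "$DOMAIN" \
         -uv INFA_USER -pv INFA_PWD \
         -f "$FOLDER" "$WORKFLOW"
fi
exit "$RC"
```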
Confidential, NYC, NY
ETL/ Informatica Developer
Responsibilities:
- Involved in requirement analysis, ETL design and development for extracting data from the source systems and loading it into the Data Mart.
- Involved in all phases of the SDLC, from requirements, design, development, testing and training through rollout to the field users and production support.
- Analyzed business requirements, performed Impact Analysis, created technical design specifications, developed code, performed Code Deployment, and provided production support.
- Developed mappings using various transformations such as Source Qualifier, Update Strategy, Lookup, Expression and Sequence Generator for loading the data into target tables.
- Improved performance by using Explain Plan, creating appropriate indexes, optimizing queries, and utilizing tablespaces and partitioning schemes.
- Responsible for identifying records missed at different stages from source to target and resolving the issues.
- Worked extensively on performance tuning of mappings and ETL procedures at both the mapping and session level.
- Supported and enhanced the existing Informatica environment and developed and deployed new Telematics capabilities.
- Ensured a smooth transition of work to the offshore team, transferring ETL requirements and ensuring timely completion of deliverables.
- Worked closely with database administrators and application development team(s) on the design and implementation of the database.
- Created users and folders and assigned read, write and execute permissions.
- Performed quality assurance testing of data integration and report development; monitored data load operations to ensure accuracy.
Confidential, Haines City, FL
Informatica Developer
Responsibilities:
- Involved in preparing the design documents with the senior management team to understand the business requirements.
- Designed documents containing details of mappings, sessions and workflows.
- Implemented incremental/delta data loading from relational and flat file sources into the staging area and the data warehouse.
- Prepared database scripts for creating tables in Staging and Enterprise Data Warehouse using Oracle 11g database.
- Experienced in integrating data from various sources using Informatica Data Quality as well as PowerCenter.
- Used various Informatica transformations, such as Source Qualifier, Expression, Lookup, Update Strategy, Filter, Router, Joiner and Aggregator, to develop different mappings.
- Involved in building mappings, sessions and workflows using the Informatica PowerCenter Mapping Designer and Workflow Manager.
- Implemented Slowly Changing Dimension Type 2 mappings for Customer and Product to maintain historical data using the date method.
- Implemented a Slowly Changing Dimension Type 1 mapping for Vendors to maintain current data only.
- Used the Star Schema approach for designing the data warehouse database.
- Created mappings using Lookup transformations to identify various PO/SO transaction codes from master tables.
- Involved in unit testing of mappings, including record count validation, data integrity checks for source and target columns, and data format checks.
- Involved in performance tuning by identifying bottlenecks in sources, mappings and targets.
- Analyzed session log files when a session failed, to resolve errors in mapping or session configurations.
- Prepared the data migration document for moving code from Development to Production using export and import.
Confidential, Cleveland, TN
ETL Developer
Responsibilities:
- Used Informatica PowerCenter 9.1 and all of its features extensively to transform data and load it into Oracle 11g.
- Gathered requirements for the scope of loading data from various sources into the database.
- Built integrations between various sources.
- Extracted data from various sources and flat files to transform and load into staging.
- Designed and developed ETL Mappings using Informatica to extract data from Flat Files and XML, and to load the data into the target database.
- Worked with Variables and Parameters in the mappings.
- Used various transformations such as Router, Filter, Joiner, Update Strategy and connected and unconnected Lookups for better data massaging and to migrate clean and consistent data using the Informatica PowerCenter Designer.
- Involved in the development of Informatica mappings and mapplets, and tuned them for optimum performance, dependencies and batch design.
- Worked on troubleshooting mappings to improve performance by identifying bottlenecks.
- Performed performance tuning at the mapping, session, source and target levels.
- Provided week-on-week status reports to the manager on progress and timelines.