Sr. Informatica Developer Resume
Southfield, MI
SUMMARY
- 9 years of experience in Information Technology with a strong background in database development, data warehousing (OLAP), and ETL processes using Informatica Power Center 9.x/8.x/7.x/6.x, Power Exchange, Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Metadata Manager, Mapplet Designer, Transformation Developer), Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor.
- Experience in integrating various data sources with multiple relational databases such as DB2, Oracle, and SQL Server, and in integrating data from XML files and flat files (fixed-width and delimited).
- Experience in writing stored procedures and functions.
- Experience in SQL tuning using hints and materialized views (see the SQL sketch following this summary).
- Worked on the Power Exchange bulk data movement process using the Power Exchange Change Data Capture (CDC) method.
- Tuned complex SQL queries by reviewing their explain plans.
- Involved in data analysis for source and target systems; worked with Master Data Management (MDM) concepts and methodologies and applied this knowledge in building MDM solutions.
- Experience in data mining, finding correlations and patterns among fields in a database.
- Implemented performance tuning techniques at application, database and system levels
- Worked in Agile methodology and Kanban method for faster development.
- Experience in UNIX shell programming.
- Worked with Informatica Cloud to create source and target objects and develop source-to-target mappings.
- Understanding of relational (ROLAP) and multidimensional (MOLAP) modeling, data warehousing concepts, star and snowflake schemas, database design methodologies, and metadata management.
- Experience preparing functional specifications, low-level and high-level design specifications, and source-to-target mapping documents for ETL programs/business processes.
- Experience working with Oracle stored programs, packages, cursors, triggers, database links, snapshots, roles, privileges, tables, constraints, views, indexes, sequences, synonyms, dynamic SQL, and SQL*Loader in a distributed environment.
- Experience creating and managing user accounts, security, rights, disk space, and process monitoring in Solaris and Red Hat Linux.
- Extensive knowledge of handling slowly changing dimensions (SCD) types 1, 2, and 3.
- Experience using Informatica to populate data into a Teradata DWH.
- Experience in understanding fact tables, dimension tables, and summary tables.
- Experience with AWS cloud services such as EC2, S3, RDS, ELB, EBS, VPC, Route 53, Auto Scaling groups, CloudWatch, CloudFront, and IAM for installing, configuring, and troubleshooting various Amazon images for server migration from physical hardware into the cloud.
- Designed architectural diagrams for different applications before migrating them into the Amazon cloud to ensure flexible, cost-effective, reliable, scalable, high-performance, and secure deployments.
- Built servers using AWS: imported volumes, launched EC2 instances, and created security groups, Auto Scaling groups, load balancers, Route 53 records, SES, and SNS within the defined virtual private cloud.
- Created alarms in the CloudWatch service to monitor server performance, CPU utilization, disk usage, etc.
- Wrote ad hoc data normalization jobs for new data ingested into Redshift.
- Familiar with advanced Amazon Redshift and MPP database concepts.
- Moved the company from a SQL Server database structure to an AWS Redshift data warehouse; responsible for ETL and data validation using SQL Server Integration Services.
- Optimized and tuned the Redshift environment, enabling queries to perform up to 100x faster for Tableau and SAS Visual Analytics.
- Designed and built multi-terabyte, end-to-end data warehouse infrastructure from the ground up on Amazon Redshift, handling millions of records every day at large scale.
- Used Amazon Kinesis as a platform for streaming data on AWS.
- Built custom streaming data applications for specialized needs using Kinesis as a platform.
- Configured and managed Elastic Load Balancing (ELB) to provide fault tolerance and avoid single points of failure in applications, thereby providing high availability and network load balancing.
- Worked with networking teams to configure AWS Direct Connect, establishing dedicated connections between data centers and the AWS cloud.
- Designed and configured the AWS Simple Notification Service (SNS) and Simple Email Service (SES) architecture of the solution, working with the client.
- Managed users and groups using AWS Identity and Access Management (IAM).
- Exposure to development, testing, debugging, implementation, user training & production support.
- Experience wif 24x7 production support.
- Excellent Analytical, Written and Communication skills.
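A minimal SQL sketch of the hint and materialized-view tuning referenced in this summary, assuming an Oracle database; the table, index, column, and bind names are hypothetical:

```sql
-- Force an index access path with an optimizer hint
-- (orders, idx_orders_cust, and :cust_id are illustrative names):
SELECT /*+ INDEX(o idx_orders_cust) */ o.order_id, o.order_total
FROM   orders o
WHERE  o.customer_id = :cust_id;

-- Precompute an expensive aggregate as a materialized view so the
-- optimizer can rewrite matching queries to use it:
CREATE MATERIALIZED VIEW mv_daily_sales
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
  ENABLE QUERY REWRITE
AS
SELECT order_date, SUM(order_total) AS total_sales
FROM   orders
GROUP BY order_date;
```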
TECHNICAL SKILLS
ETL: Informatica 9.6.1/9.x/8.x/7.x, DataStage 7.5/6.0, SSIS
Cloud Services: Amazon Web Services (Certified)
Databases: Oracle 12c/11g/10g/9i/8i/7.x, DB2, Teradata V13/V12/V2R5, SQL Server
Reporting Tools: Business Objects XI 3.1/R2, Web Intelligence, Crystal Reports 8.x/10.x
Database Modeling: Erwin 4.0, Rational Rose
Languages: SQL, COBOL, PL/SQL, Java, C
DB Tools: SQL*Plus, SQL*Loader, SQL*Forms, TOAD
Web Tools: HTML, XML, JavaScript, Servlets, EJB
OS: Windows NT/2000/2003/7, UNIX, Linux, AIX
PROFESSIONAL EXPERIENCE
Confidential, Southfield MI
Sr. Informatica Developer
Responsibilities:
- Collaborated with business analysts on requirements gathering, business analysis, and design of the enterprise data warehouse; worked on requirement analysis and on ETL design and development for extracting data from heterogeneous source systems such as DB2, MS SQL Server, Oracle, flat files, and mainframes and loading it into staging and star schemas.
- Created and modified business requirement documentation.
- Created ETL specifications using gap analysis.
- Provided production support on a monthly rotation basis.
- Worked with Master Data Management (MDM) concepts and methodologies and applied this knowledge in building MDM solutions.
- Used Informatica Data Quality (IDQ) to profile the data and apply rules to the Membership and Provider subject areas in support of Master Data Management (MDM).
- Involved in migrating maps from IDQ to Power Center.
- Eliminated and prevented duplicate and inconsistent data using Informatica MDM.
- Worked on CISM tickets for production issues for the Trucks and Cars data marts.
- Extracted mainframe data for local landscapes such as Credit Check and Application.
- Created Unit Test cases, supported SIT and UAT.
- Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
- Used IDQ's standardized plans for address and name cleanup.
- Applied rules and profiled source and target table data using IDQ.
- Worked with Informatica Power Exchange and Informatica Cloud to integrate and load data into an Oracle database.
- Worked with Informatica Cloud to create source and target objects and develop source-to-target mappings.
- Migrated Informatica code across environments.
- Developed mappings following best practices and standards.
- Implemented Incremental load logic.
- Worked on the Power Exchange bulk data movement process using the Power Exchange Change Data Capture (CDC) method, Power Exchange Navigator, and Power Exchange bulk data movement; Power Exchange CDC can retrieve updates at user-defined intervals or in near real time.
- Involved in the design and development of technical specifications using Hadoop technology.
- Debugged code to find bottlenecks for performance tuning.
- Extracted data from sources such as SQL Server and fixed-width and delimited flat files, transformed the data according to business requirements, and loaded it into Oracle and Teradata databases.
- Experience writing and optimizing SQL queries in SQL Server.
- Performed data analysis and data profiling using SQL on various source systems, including SQL Server.
- Designed and developed complex mappings to load historical and weekly data from legacy systems into the Oracle database.
- Worked with the Teradata database, writing BTEQ queries and using the MultiLoad, FastLoad, and FastExport loading utilities.
- Created CISM tickets with HP to fix DB2 issues.
- Modified the MORIS (Trucks) data mart load with enhancements.
- Created defects in ALM and assigned them to developers.
- Fine-tuned complex SQL by checking the explain plans of the queries (see the sketch at the end of this list).
- Improved ETL job run times from hours to minutes where pushdown optimization and partitioning were unavailable.
- Created mappings to load data from ODS to DWH.
- Used DB2 Loader utility to load large tables efficiently.
- Created session partitions to load data faster.
- Extensively used pmcmd commands at the command prompt and executed UNIX shell scripts to automate workflows and populate parameter files.
- Used Informatica Metadata Manager to trace data lineage in the mappings.
- Worked in an Agile SDLC.
- Performed ABC (Automated Balance and Control) analysis for each business area and recommended changes to capture statistics for each workflow and its sessions.
- Tuned ETL loads by checking the load on the server CPU.
- FTP'd files to the Informatica server using the session-level FTP connection property.
- Used Informatica Power Exchange to read Mainframe files.
- Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
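A minimal sketch of the explain-plan-driven tuning described above, assuming Oracle; the tables and predicates are hypothetical:

```sql
-- Generate the optimizer's plan for a candidate query
-- (sales, customers, and the date filter are illustrative):
EXPLAIN PLAN FOR
SELECT c.customer_id, SUM(s.sale_amount)
FROM   sales s
JOIN   customers c ON c.customer_id = s.customer_id
WHERE  s.sale_date >= DATE '2015-01-01'
GROUP BY c.customer_id;

-- Review the plan for full scans, join order, and cardinality estimates:
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```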
Environment: Informatica Power Center 9.6.1, Informatica Metadata Manager, DB2 9.8, Rapid SQL 8.2, IBM Data Studio, PLSQL, Linux, Power Exchange 9.5.1, Copy Book, Tivoli, ALM, Shell Scripting.
Confidential, Pittsburgh, PA
Sr. Informatica Developer
Responsibilities:
- Designed mappings to load data from various sources such as Oracle, flat files, XML files, Teradata, Sybase, and MS SQL Server into Oracle, Teradata, XML, and SQL Server targets.
- Moved data from source systems to different schemas based on the dimension and fact tables, using slowly changing dimensions type 2 and type 1 (see the sketch at the end of this list).
- Experience in data mining, finding correlations and patterns among fields in a database.
- Hands-on experience building MDM composite services using the Services Integration Framework (SIF), including setting up and configuring the SIF SDK.
- Analyzed suspects in MDM while loading the data.
- Interacted with the client on MDM requirements.
- Worked on Teradata databases, profiling data from production tables across different sources.
- Involved in Agile project roadmap estimation, pursuits, solution architecture, Hadoop best practices, and architecture design/implementation within the company.
- Involved in column profiling of tables using the Informatica Analyst tool (IDQ).
- Analyzed profiling requirements from MDM.
- Worked with Informatica Cloud to create source/target connections and to monitor and synchronize data.
- Experience in implementing Data Quality rules on Hive Data sources using Informatica Big Data Edition 10.x.
- Worked with Informatica Data Quality (IDQ) 9.6.1 and 10.1.1, Teradata 15.10, Oracle 11g, UC4, Windows, WinSCP, Toad, Big Data Edition 10.1.0, Hadoop Hortonworks, and Hive 2.4.
- Worked with Informatica Power Exchange and Informatica Cloud to integrate and load data into an Oracle database.
- Worked with Informatica Cloud to create source and target objects and develop source-to-target mappings.
- Used IDQ to profile the project source data, define or confirm metadata definitions, cleanse and accuracy-check the project data, check for duplicate or redundant records, and provide guidance on how to proceed with ETL processes.
- Involved in data profiling using IDQ prior to data staging.
- Extensively used XML, Normalizer, Lookup, Expression, Aggregator, Sequence Generator, Sorter, and Joiner transformations.
- Developed XML mappings for the ETL team for transformation and loading.
- Defined code values, made changes according to the values in the mapping, and explained the transformation rules to the ETL team.
- Worked on XML Source Files.
- Created complex mappings in Power Center Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
- Responsible for migrating work from the development environment to the testing environment.
- Responsible for resolving testing issues.
- Created documents with detailed explanations of the mappings, test cases, expected results, and actual results.
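A minimal SCD Type 2 sketch of the dimension loads described above, assuming Oracle; dim_customer, stg_customer, the sequence, and their columns are hypothetical names:

```sql
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE dim_customer d
SET    d.effective_end_dt = SYSDATE, d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.address <> d.address);

-- Step 2: insert a new current version for any source customer that no
-- longer has a current row (covers both new and just-expired customers).
INSERT INTO dim_customer
       (customer_key, customer_id, address, effective_start_dt,
        effective_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
       SYSDATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');
```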
Environment: Informatica Power Center 9.5, Oracle 11g, Teradata V13.0, FastLoad, MultiLoad, Teradata SQL Assistant, MS SQL Server 2012, SQL, PL/SQL, T-SQL, SQL*Plus, IDQ, XML, XSD, Oracle SQL Developer.
Confidential, Pittsburgh, PA
ETL Developer
Responsibilities:
- Worked on data mining to analyze data from different perspectives and summarize it into useful information.
- Involved in requirement analysis and in ETL design and development for extracting data from heterogeneous source systems such as MS SQL Server, Oracle, and flat files and loading it into staging and star schemas.
- Used SQL Assistant to query Teradata tables.
- Experience in various stages of the System Development Life Cycle (SDLC) and approaches such as Waterfall, Spiral, and Agile.
- Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another; involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
- Worked with the Teradata MultiLoad and FastLoad utilities to load data from Oracle and SQL Server into Teradata.
- Involved in massive data cleansing prior to data staging.
- Configured workflows with Email tasks to send mail with the session log on session failures and on target failed rows.
- Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
- Worked on Change Data Capture (CDC) using CHKSUM to detect changes in the data when no flag or date column is present to mark a changed row (see the sketch at the end of this list).
- Worked on reusable code known as tie-outs to maintain data consistency; it compares the source and target after ETL loading is complete to validate that no data was lost during the ETL process.
- Cleansed, standardized, and enriched customer information using Informatica MDM.
- Worked with data extraction, transformation, and loading using BTEQ, FastLoad, and MultiLoad.
- Used the Teradata FastLoad/MultiLoad utilities to load data into tables.
- Worked extensively with UNIX shell scripts for job execution and automation.
- Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
- Involved in unit testing and user acceptance testing to verify that data extracted from the different source systems loaded into the target according to user requirements.
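A minimal T-SQL sketch of the checksum-based change detection described above; the table and column names are hypothetical:

```sql
-- Identify changed rows by comparing checksums of the tracked columns,
-- since the source carries no change flag or date column:
SELECT s.customer_id
FROM   stg_customer s
JOIN   dim_customer d
       ON d.customer_id = s.customer_id
WHERE  CHECKSUM(s.name, s.address, s.phone)
    <> CHECKSUM(d.name, d.address, d.phone);
```

Note that CHECKSUM can produce collisions, so BINARY_CHECKSUM or HASHBYTES is often preferred where stricter change detection is needed.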
Environment: Informatica Power Center 9.5 (Repository Manager, Mapping Designer, Workflow Manager, Workflow Monitor), Oracle 11g, Teradata V13.0, FastLoad, MultiLoad, Teradata SQL Assistant, MS SQL Server 2012, SQL, PL/SQL, T-SQL, Unix, Cognos, Tidal.
Confidential, Ann Arbor, MI
Sr. Informatica Developer
Responsibilities:
- Coordinated with business analysts and the data architecture team to ensure compliance with standards and consistency with long-term infrastructure plans.
- Prepared detailed design and functional specifications for the ETL programs/business processes.
- Involved in design reviews, code reviews, and test reviews, providing valuable suggestions.
- Worked with different caches, such as the index cache, data cache, lookup cache (static, dynamic, and persistent), and join cache, while developing mappings.
- Performed data reconciliation in various source systems and in Teradata.
- Took part in migration of jobs from UIT to SIT and to UAT.
- Involved in Informatica Code Migration across various Environments.
- Performed troubleshooting on teh load failure cases, including database problems.
- Created database objects such as tables (including global temporary and volatile tables), macros, views, and procedures (see the sketch at the end of this list).
- Documented all designs from source to stage, stage to integration, and integration to the atomic layer.
- Coordinated with the offshore team and mentored junior developers.
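A minimal Teradata sketch of the object types noted above; the table, macro, and column names are hypothetical:

```sql
-- Session-scoped volatile table holding intermediate reconciliation results:
CREATE VOLATILE TABLE vt_recon_diff AS
(
  SELECT src.acct_id, src.balance AS src_balance, tgt.balance AS tgt_balance
  FROM   stg_accounts src
  JOIN   dw_accounts  tgt ON tgt.acct_id = src.acct_id
  WHERE  src.balance <> tgt.balance
)
WITH DATA
ON COMMIT PRESERVE ROWS;

-- Parameterized macro wrapping a frequently used lookup:
CREATE MACRO get_account (in_acct_id INTEGER) AS
(
  SELECT * FROM dw_accounts WHERE acct_id = :in_acct_id;
);
```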
Environment: Informatica Power Center 9.5/9.1, Oracle 11g, Teradata V13.0, FastLoad, MultiLoad, Teradata SQL Assistant, MS SQL Server 2012, SQL, PL/SQL, T-SQL, SQL*Plus, TOAD, Erwin, AIX, Shell Scripts, Autosys, UNIX.
Confidential, Dallas, TX
Programmer Analyst
Responsibilities:
- Involved in requirements gathering and analysis for the data marts, focusing on data analysis, data quality, and data mapping between staging tables and data warehouses/data marts.
- Designed and developed processes to address data quality issues and to detect and resolve error conditions.
- Wrote SQL scripts to validate and correct inconsistent data in the staging database before loading it into target databases (see the sketch at the end of this list).
- Analyzed session logs, bad files, and error tables to troubleshoot mappings and sessions.
- Wrote shell script utilities to detect error conditions in production loads and take corrective action, and wrote scripts to back up and restore repositories and log files.
- Scheduled workflows using Autosys job plan.
- Provided production support, including root cause analysis, bug fixing, and promptly updating business users on day-to-day production issues.
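A minimal sketch of the staging-validation SQL described above; the table and column names are hypothetical:

```sql
-- Flag staging rows that break basic integrity rules before the load:
SELECT stg_id, customer_id, order_amount
FROM   stg_orders
WHERE  customer_id IS NULL
   OR  order_amount < 0;

-- Correct a known inconsistency in place (normalize case and padding):
UPDATE stg_orders
SET    ship_state = UPPER(TRIM(ship_state))
WHERE  ship_state <> UPPER(TRIM(ship_state));
```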
Environment: Informatica Power Center 8.6.1/7.x, Oracle 10g, Autosys, Erwin 4.5, CMS, TOAD 9.0, SQL, PL/SQL, UNIX, SQL Loader, MS SQL Server 2008.