
Sr. Informatica Developer Resume


NY

SUMMARY

  • 8+ years of experience in Information Technology with a strong background in database development, data warehousing (OLAP), and ETL processes using Informatica Power Center, Power Exchange, Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Metadata Manager, Mapplet Designer, Transformation Developer), Repository Manager, Repository Server, Workflow Manager & Monitor.
  • Experience in integrating various data sources with multiple relational databases like DB2, Oracle, and SQL Server; worked on integrating data from XML files and flat files (fixed-width and delimited).
  • Experience in writing stored procedures and functions, and in SQL tuning using hints and materialized views.
  • Worked on Power Exchange bulk data movement using the Power Exchange Change Data Capture (CDC) method; tuned complex SQL queries by reviewing their explain plans.
  • Involved in data analysis for source and target systems; familiar with Master Data Management (MDM) methodologies and concepts, with the ability to apply this knowledge in building MDM solutions.
  • Experience in data mining: finding correlations or patterns among fields in databases.
  • Implemented performance tuning techniques at application, database and system levels
  • Worked in Agile methodology and Kanban method for faster development.
  • Understanding of relational (ROLAP) and multidimensional (MOLAP) modeling, data warehousing concepts, star and snowflake schemas, database design methodologies, and metadata management.
  • Experience in preparing functional specifications, low-level and high-level design specifications, and source-to-target mapping documents for ETL programs/business processes.
  • Experience in working with Oracle stored programs, packages, cursors, triggers, database links, snapshots, roles, privileges, tables, constraints, views, indexes, sequences, synonyms, dynamic SQL, and SQL*Loader in a distributed environment.
  • Experience in creating and managing user accounts, security, rights, disk space, and process monitoring in Solaris and Red Hat Linux.
  • Experience with AWS cloud services like EC2, S3, RDS, ELB, EBS, VPC, Route 53, Auto Scaling groups, CloudWatch, CloudFront, and IAM for installing, configuring, and troubleshooting various Confidential images for server migration from physical hardware into the cloud.
  • Designed architectural diagrams for different applications before migrating them into the Confidential cloud to make them flexible, cost-effective, reliable, scalable, high-performance, and secure.
  • Built servers using AWS: importing volumes, launching EC2 instances, creating security groups, auto scaling, load balancers, Route 53, SES, and SNS in the defined Virtual Private Cloud (VPC).
  • Wrote ad-hoc data normalization jobs for new data ingested into Redshift.
  • Familiar with advanced Confidential Redshift and MPP database concepts.
  • Moved the company from a SQL Server database structure to an AWS Redshift data warehouse; responsible for ETL and data validation using SQL Server Integration Services.
  • Optimized and tuned the Redshift environment, enabling queries to perform up to 100x faster for Tableau and SAS Visual Analytics.
  • Designed and built multi-terabyte, full end-to-end data warehouse infrastructure from the ground up on Confidential Redshift for large-scale data, handling millions of records every day.
  • Used Confidential Kinesis as a platform for streaming data on AWS.
  • Built custom streaming-data applications for specialized needs using Kinesis as a platform.
  • Configured and managed Elastic Load Balancing (ELB) to achieve fault tolerance and avoid single points of failure in applications, providing high availability and network load balancing.
  • Worked with networking teams in configuring AWS Direct Connect to establish dedicated connections between data centers and the AWS Cloud.
  • Designed and configured the AWS Simple Notification Service (SNS) and Simple Email Service (SES) architecture of the solution, working with the client.
  • Managed users and groups using Confidential Identity and Access Management (IAM).
  • Exposure to development, testing, debugging, implementation, user training & production support.
  • Excellent Analytical, Written and Communication skills.
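The Power Exchange CDC experience mentioned above rests on a simple idea: capture only what changed between two states of a keyed table. As an illustration only (Power Exchange itself reads database logs; this is not its implementation), the core diff can be sketched in a few lines of Python with made-up sample data:

```python
# Illustrative sketch of the change-data-capture (CDC) concept: compare a prior
# snapshot of a keyed table with the current one and emit only changed rows,
# tagged with an operation code. All table data below is hypothetical.

def capture_changes(previous, current):
    """Return (op, key, row) tuples: 'I' = insert, 'U' = update, 'D' = delete."""
    changes = []
    for key, row in current.items():
        if key not in previous:
            changes.append(("I", key, row))        # new row
        elif previous[key] != row:
            changes.append(("U", key, row))        # changed row
    for key, row in previous.items():
        if key not in current:
            changes.append(("D", key, row))        # deleted row
    return changes

before = {1: {"name": "Acme", "city": "NY"}, 2: {"name": "Globex", "city": "VA"}}
after  = {1: {"name": "Acme", "city": "Boston"}, 3: {"name": "Initech", "city": "NY"}}
print(sorted(capture_changes(before, after)))
```

Real CDC tools read transaction logs rather than diffing snapshots, which is what makes them viable for bulk data movement; the output contract (insert/update/delete events per key) is the same.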

TECHNICAL SKILLS

ETL Tools: Informatica 6.0/6.1/7.1/8.x/9.x, SSIS, DataStage 7.5/6.0

RDBMS: Oracle 12c/11g/10g/9i/8i/7.x, DB2, Teradata V13/V12/V2R5, SQL Server 2000/2005

Database skills: Stored Procedures, Database Triggers, and Packages

Reporting Tools: Business Objects XI 3.1/r2, Web Intelligence, Crystal Reports 8.X/10.X

BI / OLAP tools: Cognos Series 8 v8.2/8.3/8.4, SSRS, Business Objects.

Languages: SQL, HTML/DHTML, UNIX, C, C++, Java, COBOL, VB .net, Assembly language

Web Servers: IIS v4.0/5.0/6.0, Apache Tomcat

OS: UNIX/Solaris, Red Hat Linux, AIX, Windows 2000/2003/XP/vista

Hardware: IBM PC Compatibles and NT servers

Version Control: Visual Source Safe 6.0, CVS, Documentum

Tools: Erwin 3.x, MS Project, MS Visio, Make, Autosys

PROFESSIONAL EXPERIENCE

Confidential, NY

Sr. Informatica Developer

Responsibilities:

  • Involved in requirement gathering and analysis; understood business requirements, identified the flow of information, and analyzed the existing systems.
  • Analyzed the source data coming from Oracle, flat files, and DB2; coordinated with the data warehouse team in developing the dimensional model.
  • Involved in data analysis for source and target systems; good understanding of data warehousing concepts, staging tables, dimensions, facts, and star schemas.
  • Integration of various data sources like Oracle, SQL Server, Fixed Width and Delimited Flat Files.
  • Extensively worked on Informatica Power Center tools- Mapping Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Developed slowly changed dimensions (SCD) Type 2 for loading data into Dimensions and Facts.
  • Worked on Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Stored Procedure, Sequence Generator, Filter, Sorter, and Source Qualifier.
  • Involved in Design and Development of technical specifications using Hadoop Technology.
  • Worked on debugging code to find bottlenecks for performance tuning.
  • Extracted data from sources like SQL Server and fixed-width and delimited flat files, transformed the data according to the business requirements, and loaded it into Oracle and Teradata databases.
  • Experience in writing SQL queries and optimizing queries in SQL Server.
  • Performed data analysis and data profiling using SQL on various source systems, including SQL Server.
  • Designed and developed complex mappings to load historical and weekly data from legacy systems to Oracle.
  • Worked on Teradata databases, profiling the data from production tables from different sources.
  • Used Informatica Data Quality (IDQ) to profile the data and apply rules to the Membership and Provider subject areas in support of Master Data Management (MDM).
  • Involved in migration of the maps from IDQ to PowerCenter.
  • Eliminated and prevented duplicate and inconsistent data using Informatica MDM.
  • Worked on CISM tickets for production issues in the Trucks and Cars data marts.
  • Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another; involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
  • Created Unit Test cases, supported SIT and UAT.
  • Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
  • Used IDQ's standardized plans for address and name clean-ups.
  • Applied rules and profiled the source and target tables' data using IDQ.
  • Created shell scripts to run daily jobs and extract files from remote locations for data loads.
  • Used Informatica Metadata Manager to find the data lineage in the mappings.
  • Performed analysis for ABC (Automated Balance and Control) for each business area and recommended changes to capture stats for each workflow and its sessions.
  • Transferred files to the Informatica server via FTP using the session-level FTP connection property.
  • Used Informatica Power Exchange to read Mainframe files.
  • Created reusable Mappings, Mapplet, Transformations and Parameter files for Data Quality Plan.
  • Improved Informatica performance by creating indexes and using cache files and session-level partitioning.
  • Created, updated and maintained ETL technical documentation for Production Migration.
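The SCD Type 2 loads described above follow a standard pattern: when a tracked attribute changes, the current dimension row is expired and a new version is inserted, preserving history. A minimal Python sketch of that logic (column names are hypothetical; Informatica implements this with Lookup/Update Strategy transformations, not Python):

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional "open-ended" end date

def apply_scd2(dim_rows, source_row, load_date):
    """Apply one incoming source row to a Type 2 dimension (list of dicts)."""
    current = next((r for r in dim_rows
                    if r["natural_key"] == source_row["natural_key"] and r["current"]),
                   None)
    if current is None:
        # brand-new member: insert the first version
        dim_rows.append({**source_row, "eff_date": load_date,
                         "end_date": HIGH_DATE, "current": True})
    elif current["attrs"] != source_row["attrs"]:
        # tracked attributes changed: expire the old version, insert a new one
        current["end_date"] = load_date
        current["current"] = False
        dim_rows.append({**source_row, "eff_date": load_date,
                         "end_date": HIGH_DATE, "current": True})
    return dim_rows

dim = []
apply_scd2(dim, {"natural_key": 100, "attrs": {"city": "NY"}}, date(2020, 1, 1))
apply_scd2(dim, {"natural_key": 100, "attrs": {"city": "VA"}}, date(2020, 6, 1))
print(len(dim), [r["current"] for r in dim])  # 2 [False, True]
```

The same comparison (no match → insert; match with changed attributes → expire and insert) is what the Lookup-plus-Update-Strategy mapping performs row by row.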

Environment: Informatica Power Center 9.x/8.x, Oracle 10g, PL/SQL, Teradata, DataFlux, BPM (Business Process Management), WinCVS, Windows XP, DB2 9.8, UNIX, Informatica Metadata Manager, Rapid SQL, IBM Data Studio, Linux, Power Exchange 9.5.1, Copybook, Tivoli, ALM, Shell Scripting.

Confidential, Norfolk, VA

Informatica Developer

Responsibilities:

  • Requirement gathering, analysis, and design of technical specifications for the data migration according to the business requirements.
  • Developed logical and physical dimensional data models using Erwin 7.1.
  • Developed test cases and test plans to complete unit testing; supported system testing.
  • Planned and coordinated testing across multiple teams, tracked and reported status, created testing cycle plan, scenarios etc.
  • Designed, developed and improved complex ETL structures to extract, transform and load data from multiple data sources into data warehouse and other databases based on business requirements.
  • Developed complex mappings and SCD Type I, Type II, and Type III mappings in Informatica to load the data from various sources using different transformations such as Source Qualifier, Lookup, Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Normalizer, Filter, Rank, Router, Stored Procedure, XML, and SQL transformations.
  • Responsible for normalizing COBOL files using the Normalizer transformation.
  • Responsible for testing, modifying, debugging, documenting, and implementing Informatica mappings.
  • Performed metadata validation, reconciliation, and appropriate error handling in ETL processes; troubleshot data issues, validated result sets, and recommended and implemented process improvements.
  • Extensively worked with performance tuning of Oracle.
  • Used ANALYZE, DBMS_STATS, explain plans, SQL trace, SQL hints, and TKPROF to tune SQL queries.
  • Extensively used Oracle partitioning (range/hash/list), indexes (bitmap, B-tree, reverse key, etc.), and various join types (hash join, sort-merge, nested iteration join) to improve performance.
  • Used Informatica Power Exchange for loading/retrieving data from mainframe system.
  • Hands on experience wif Informatica Metadata Manager.
  • Designed, developed, and managed PowerCenter upgrades from v7.x to v8.6; migrated ETL code from Informatica v7.x to v8.6; integrated and managed the workload of Power Exchange CDC.
  • Worked with incremental loading using parameter files, mapping variables, and mapping parameters.
  • Developed user-defined functions (UDFs) to extract data from flat files.
  • Worked with Informatica Power Exchange and Informatica Cloud to integrate and load data into Oracle.
  • Developed and modified UNIX Korn shell scripts to meet the requirements after the system modifications; also involved in monitoring and maintenance of the batch jobs.
  • Worked with Autosys for job scheduling.
  • Worked with complex Cognos reports in Report Studio using master-detail relationships, drill-through, drill-up and drill-down, burst options, and prompts.
  • Created dashboards for a managerial overview with drill-up and drill-down capabilities.
  • Created complex reports that had multiple pages and multiple UI items like lists, charts, and graphs.
  • Used data source query subjects, model query subjects, and stored procedure query subjects in Framework Manager.
  • Assisted the lead architect with project planning, analysis, design, testing, documentation, and user training.
  • Managed work assignments and coordinated within the development team.
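The incremental-load pattern above hinges on a high-water mark: a mapping parameter such as a last-extract timestamp is read from a parameter file to bound the extract, then rewritten after a successful run. A hedged sketch in Python (the folder, workflow, and parameter names are invented; the bracketed section header mimics the Informatica parameter-file layout):

```python
# Sketch of parameter-file-driven incremental loading. ISO date strings compare
# correctly as plain strings, which keeps the example dependency-free.

def read_param(lines, name):
    """Find 'name=value' in a parameter file's lines and return the value."""
    for line in lines:
        if line.startswith(name + "="):
            return line.split("=", 1)[1].strip()
    return None

def incremental_extract(rows, last_ts):
    """Keep only rows updated after the last extract timestamp."""
    return [r for r in rows if r["updated"] > last_ts]

param_file = ["[SALES.WF:wf_daily_load.ST:s_m_load_orders]",   # hypothetical names
              "$$LAST_EXTRACT_TS=2021-03-01"]
rows = [{"id": 1, "updated": "2021-02-15"},
        {"id": 2, "updated": "2021-03-10"}]

last = read_param(param_file, "$$LAST_EXTRACT_TS")
delta = incremental_extract(rows, last)
print([r["id"] for r in delta])                     # only the changed row
new_mark = max(r["updated"] for r in delta)
param_file[1] = "$$LAST_EXTRACT_TS=" + new_mark     # advance the high-water mark
```

In PowerCenter the filter lives in the Source Qualifier's WHERE clause and the mark is advanced via a mapping variable or a post-session script; the bookkeeping is the same.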

Environment: Informatica PowerCenter 7.1.x/8.6, Oracle 10g & 11g, PL/SQL, Cognos 8.3, Toad, UNIX, Erwin 7.1, Windows XP, Autosys.

Confidential

Informatica Developer

Responsibilities:

  • Source system evaluation, standardizing received data, understanding business/data transformation rules, business structure, hierarchy, relationships, data transformation through mapping development, validation and testing of mappings.
  • Developed logical and physical models per business requirements using Erwin 4.0.
  • Performed requirement analysis and developed technical design specifications for the data migration according to the business requirements.
  • Used Informatica Power Exchange for loading/retrieving data from mainframe systems.
  • Developed complex ETL structures to extract, transform and load data from multiple data sources into data warehouse based on business requirements.
  • Replaced repeatedly used mapping logic with mapplets for formatting data and data type conversion.
  • Ensured accurate, appropriate and effective use of data; including data definition, structure, documentation, long-range requirements, and operational guidelines.
  • Monitored ETL process activity and utilization, with particular strengths in performance tuning of highly transactional data integration packages in both the development and production environments.
  • Responsible for troubleshooting and resolving issues related to system performance, Informatica applications, and data integrity
  • Sourced data from Teradata to Oracle using Fast Export and Oracle SQL Loader.
  • Created load scripts using Teradata Fast Load and Mload utilities and procedures in SQL Assistant. Offered production support for daily jobs.
  • Worked with UNIX commands and used UNIX shell scripting to automate jobs.
  • Wrote UNIX (Korn) shell scripts to back up the log files in QA and production.
  • Performed Unit and Integration testing and wrote test cases.
  • Worked extensively in defect remediation and supported the QA testing.
  • Involved in periodic maintenance of environments by applying patches and content upgrades.
  • Involved in code migration for versioned repositories.
  • Involved in taking repository backups and restoring them, starting and stopping Informatica services, and working with pmcmd commands.
  • Created Drill Through Reports & Master Detail Reports for Sales Department.
  • Extensively used prompts, filters, cascading prompts, calculations, conditional variables, multiple queries for data extraction in reports.
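The Teradata load scripts mentioned above (FastLoad/MultiLoad) are themselves plain text, so generating them programmatically is common. A rough sketch of a generator for a FastLoad-style control script follows; the database, table, file, and column names are hypothetical, and the generated syntax should be checked against the FastLoad manual before real use:

```python
# Generate a Teradata FastLoad-style control script as a string. Credentials
# are left as shell-style placeholders rather than hard-coded.

def fastload_script(db, table, datafile, columns):
    cols = ",\n    ".join(f"{c} (VARCHAR(64))" for c in columns)   # simplistic typing
    vals = ", ".join(f":{c}" for c in columns)
    return f"""LOGON tdpid/${{TD_USER}},${{TD_PASS}};
DATABASE {db};
BEGIN LOADING {db}.{table} ERRORFILES {table}_e1, {table}_e2;
SET RECORD VARTEXT ",";
DEFINE
    {cols}
FILE={datafile};
INSERT INTO {db}.{table} VALUES ({vals});
END LOADING;
LOGOFF;"""

script = fastload_script("edw", "stg_orders", "orders.csv", ["order_id", "amount"])
print(script)
```

Driving such generated scripts from cron or a scheduler is what turns the utilities into the repeatable daily-load jobs the bullet points describe.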

Environment: Informatica PowerCenter 8.1, Informatica Power Exchange, Teradata, PL/SQL, SQL*Plus, SQL*Loader, Toad, UNIX, Windows XP, Erwin 4.0, Cognos 8.3.

Confidential

Informatica Consultant

Responsibilities:

  • Designed and developed several ETL scripts using Informatica, UNIX shell scripts.
  • Extensively used all the features of Informatica versions 6.x and 7.x, including the Designer, Workflow Manager, Repository Manager, and Workflow Monitor.
  • Developed and modified UNIX shell scripts to reset and run Informatica workflows using pmcmd in the UNIX environment; conversant with the Informatica API calls.
  • Worked extensively with mappings using Expression, Aggregator, Filter, Lookup, Update Strategy, and Stored Procedure transformations; performed optimization and performance tuning of Informatica objects and database objects to achieve better performance.
  • Extensively worked with SQL queries; created stored procedures, packages, triggers, and views using PL/SQL programming.
  • Created flexible mappings/sessions using parameters and variables, making heavy use of parameter files.
  • Improved session run times by partitioning the sessions; also involved in database fine-tuning (creating indexes, stored procedures, etc.) and partitioning Oracle databases.
  • Set up batches and sessions to schedule loads at the required frequency using the PowerCenter Server Manager.
  • Worked on several data mart projects as a senior data warehouse architect; was involved in the complete system development life cycle, from requirement specifications to delivering the tested product.
  • Extensively worked with SQL queries; created cursors, functions, stored procedures, packages, triggers, views, and materialized views using PL/SQL programming.
  • Extensively worked with performance tuning of Oracle.
  • Extensively used Oracle partitioning (range/hash/list), indexes (bitmap, B-tree, reverse key, etc.), and various join types (hash join, sort-merge, nested iteration join) to improve performance.
  • Sourced data from Teradata to Oracle using Fast Export and Oracle SQL Loader
  • Created load scripts using Teradata Fast Load and Mload utilities and procedures in SQL Assistant.
  • Created ad-hoc reports and migrated reports from Cognos Impromptu.
  • Mentored junior Informatica developers and worked with offshore teams on development.
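The pmcmd-driven shell scripts mentioned above boil down to assembling one command line per workflow. A hedged Python sketch that only builds the `pmcmd startworkflow` command (the domain, service, folder, and workflow names are invented, and the actual invocation is left commented out because it needs an Informatica installation):

```python
import subprocess  # noqa: F401  (would be used for the real invocation)

def pmcmd_start(domain, service, user, folder, workflow):
    """Assemble a pmcmd startworkflow command; password comes from an env var."""
    return ["pmcmd", "startworkflow",
            "-sv", service, "-d", domain,
            "-u", user, "-pv", "PM_PASSWORD",   # -pv reads the password variable
            "-f", folder, "-wait", workflow]

cmd = pmcmd_start("Domain_ETL", "IS_ETL", "etl_user", "SALES", "wf_daily_load")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment on a host with pmcmd installed
```

Keeping the password in an environment variable (read via `-pv`) rather than on the command line is the usual practice, since command lines are visible in process listings.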

Environment: Informatica Power Center 7.1.x, Oracle 9i, Teradata, PL/SQL, Toad, Erwin 3.5.2, Report Net 1.1, UNIX, Shell Scripting, Windows XP, MS-Access.

Confidential

Informatica Developer

Responsibilities:

  • Assisted with data modeling using Erwin 3.5.
  • Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.
  • Involved in Database design, entity relationship modeling and dimensional modeling using Star schema.
  • Extensively used all the features of Informatica version 7.x, including the Designer, Workflow Manager, Repository Manager, and Workflow Monitor.
  • Developed and modified UNIX shell scripts to reset and run Informatica workflows using pmcmd in the UNIX environment; conversant with the Informatica API calls.
  • Developed ETL mappings to perform tasks like validating file formats, business rules, database rules, and statistical operations.
  • Worked extensively with mappings using Expression, Aggregator, Filter, Lookup, Update Strategy, and Stored Procedure transformations.
  • Created flexible mappings/sessions using parameters, variables and heavily using parameter files.
  • Created test cases for the above projects to provide error-free solutions; monitored workflows and sessions using the Workflow Monitor.
  • Debugged through session logs and fixed issues, utilizing the database for efficient transformation of data.
  • Prepared mapping documents, data migration documents, and other project-related documents like mapping templates and Visio diagrams.
  • Developed technical documents to ease improvements to existing code.
  • Involved in data conversion, migration, integration, and quality profiling tasks.
  • Implemented Filters, Calculations, Conditions, and Graphs & Prompts in Impromptu reports
  • Developed reports in ReportNet 1.1 using Report Studio and Query Studio.
  • Created ad-hoc reports and migrated reports from Cognos Impromptu.
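File-format validation of the kind described above typically means checking each fixed-width record's length and field contents, routing failures to a reject file. A small illustrative Python sketch (the record layout and sample rows are made up):

```python
# Validate fixed-width records: check record length and that a numeric field
# parses, collecting bad records with a reason for rejection.

LAYOUT = {"record_len": 20, "amount": (10, 18)}  # hypothetical: amount in cols 11-18

def validate_records(lines):
    good, bad = [], []
    for n, line in enumerate(lines, start=1):
        start, end = LAYOUT["amount"]
        if len(line) != LAYOUT["record_len"]:
            bad.append((n, "bad length"))
        elif not line[start:end].strip().isdigit():
            bad.append((n, "non-numeric amount"))
        else:
            good.append(line)
    return good, bad

rows = ["CUST000001  00012500",   # valid
        "CUST000002  XX12500 ",   # corrupt amount field
        "SHORT"]                  # truncated record
good, bad = validate_records(rows)
print(len(good), bad)
```

In a mapping, the same split is done with an Expression transformation feeding a Router: valid rows continue to the target, rejects go to an error table with the reason code.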

Environment: Informatica Power Center 7.1.x, Oracle 9i, DB2, MS SQL Server 2005, PL/SQL, SQL*Plus, SQL*Loader, Cognos 7 Series, Toad, UNIX, Windows XP.
