
Sr Informatica Cloud Developer Resume


Rosemont, IL

SUMMARY:

  • 13 years 3 months of professional experience in Data Warehousing, working as a Sr ETL Informatica/Cloud Developer
  • Experience working with Big Data technologies, implementing Cloudera Hadoop 5.4.0 on Amazon Web Services (AWS).
  • Expertise in the comprehensive Software Development Life Cycle (SDLC) using the waterfall model and agile methodology.
  • Strong understanding of interpreting Business Intelligence (BI) requirements and translating them into an effective data warehouse design using dimensional modeling techniques.
  • Expert level experience in creating requirement specifications, detailed design specifications, architecture diagrams, and end-user manuals
  • Expert level experience in Data Analysis, Data Integration, Data Mapping, Data Conversion, Data Migration, Data Flows, and Metadata Management Services and Configuration in OLTP and OLAP environments.
  • Expert level experience in designing & creating database objects in Oracle 7/8i/9i/10g/11g, SQL Server, Teradata, DB2 for OLTP and OLAP applications.
  • Expert level experience in developing, testing, deploying, and maintaining ETL workflows for multiple data sources like flat files, XML, Teradata, DB2, Oracle, and SQL Server.
  • Strong experience in developing complex database objects like Stored Procedures, Functions, Packages, Triggers, Tables, Views, Indexes, Constraints, Synonyms, Materialized Views, and Partition Tables using SQL and PL/SQL
  • Strong Experience working with Informatica MDM as well as experience with Power Exchange
  • Experience working with Informatica (9.x) suite of products like MDM, IDQ, IDE
  • Extensive experience in design and development of Complex Mappings, Reusable Objects (Mapplets, Lookups, Transformations, Tasks and Sessions), transformations, extractions and loads.
  • Expert level development skills with Informatica Client Tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets Designer, Informatica Repository Manager and Workflow Manager.
  • Expert level development skills working with file transfer utilities like NDM, FTP, SFTP, and Windows Batch.
  • Extensive experience in data migration techniques using SQL*Loader (Import/Export).
  • Strong experience working with database modeling tools like Erwin and Toad, and version control tools like TFS, SVN, and Git.
  • Very good experience in debugging code and optimizing transaction data flows; good experience with performance optimization techniques in complex code involving complex mappings, transformations, and validations.
  • Highly skilled in performance tuning - analyzing query plans and tuning queries using tools such as EXPLAIN PLAN and applying hints wherever required (see the brief sketch at the end of this summary)
  • Strong exposure working in project management, leading technical teams in the delivery of large-scale software development, operational, and integration efforts.
  • Expert level skills in resolving highly complex software development issues that arise in production environments
  • Strong exposure handling production support emergency response calls and Level 1 support.
  • Expert level skills resolving business difficulties and shouldering technical and delivery responsibilities.
  • Accustomed to working in fast-paced environments with the ability to think quickly and successfully handle demanding clients
  • Excellent customer service orientation and outstanding verbal and written communication skills.
  • Excellent troubleshooting skills coupled with strong analytical and problem-solving skills.
  • Excellent interpersonal skills; a committed, result-oriented, hardworking team player with the zeal to learn new technologies and undertake challenging tasks
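
For illustration, a minimal sketch of the EXPLAIN PLAN approach noted above; the table, index, and column names are assumptions, not objects from any client system:

    -- Capture the optimizer plan for a candidate query, with an index hint applied
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(f fact_sales_dt_idx) */ f.order_id, f.sale_amt
    FROM   fact_sales f
    WHERE  f.sale_date >= DATE '2015-01-01';

    -- Review the captured plan from PLAN_TABLE
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);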

TECHNICAL SKILLS:

Job Function: Project Management, Team Lead, Requirement Gathering, Analysis, Design, Development, Testing, Release, Support, Tech Documentation

Programming: Korn Shell Scripting, Perl Shell Scripting, Python Shell Scripting, Java Scripting, Autosys 4.5 and R11, Crontab, XML, Hadoop - Map Reduce, Hive, Impala, Pig, Apache Sqoop, SSIS

Operating System: Solaris, Linux, HP UX, Confidential AIX, Windows 7/Vista/XP/2003/2000/95

Databases: Oracle 7/9i/10g/11g, SQL*Plus, PL/SQL, MS Access, Confidential DB2, Teradata

ETL: Informatica Power Center 8.6, 9.0, 9.1, 9.5, Informatica Cloud, MDM, MAV, DVO, MDX

Applications: MS Word, MS Excel, MS Access, Visio

Unix Tools: VI Editor, F-Secure SSH Client, SFTP, NDM

Reporting Tools: Business Objects and Cognos

Data Modeling tools: ER Studio, ERWIN 7, Informatica Data profiler

Database Tools: Toad, SQL Developer

Version Control: TFS, Git, CVS, SVN

PROFESSIONAL EXPERIENCE:

Confidential, Rosemont, IL

Sr Informatica Cloud Developer

Responsibilities:

  • Provide recommendations and expertise of data integration and ETL methodologies.
  • Responsible for analyzing UltiPro interfaces and creating design specifications and technical specifications based on functional requirements and analysis.
  • Design, Development, Testing and Implementation of ETL processes using Informatica Cloud
  • Developing ETL / Master data management (MDM) processes using Informatica Power Center and Power Exchange 9.1
  • Convert extraction logic from database technologies like Oracle, SQL Server, and DB2
  • Complete technical documentation to ensure the system is fully documented
  • Cleanse data prior to loading it into the target system
  • Convert specifications to programs and data mapping in an ETL Informatica Cloud environment
  • Provide High Level and Low Level Architecture Solution/Design as needed
  • Dig deep into complex T-SQL queries and stored procedures to identify items that could be converted to Informatica Cloud ISD (see the sketch after this list)
  • Develop reusable integration components, transformations, and logging processes.
  • Support system and user acceptance testing activities, including issue resolution.
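
As an illustration of the kind of T-SQL logic reviewed for conversion, a minimal sketch of a staging-load procedure; all object names (dbo.Employee, dbo.EmployeeStage, dbo.LoadControl) are assumptions, not UltiPro objects:

    -- Hypothetical T-SQL load procedure of the type evaluated for Informatica Cloud conversion
    CREATE PROCEDURE dbo.usp_LoadEmployeeStage
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Stage only the rows changed since the last successful load
        INSERT INTO dbo.EmployeeStage (EmployeeId, FirstName, LastName, UpdatedAt)
        SELECT e.EmployeeId, e.FirstName, e.LastName, e.UpdatedAt
        FROM   dbo.Employee AS e
        WHERE  e.UpdatedAt > (SELECT MAX(LoadedThrough) FROM dbo.LoadControl);

        -- Record the new high-water mark
        UPDATE dbo.LoadControl SET LoadedThrough = SYSUTCDATETIME();
    END;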

Environment: Informatica Cloud, T-SQL, Stored Procedures, SQL Server 2008, FileZilla, Windows, batch scripting, MDM

Confidential, Madison, WI

Sr Data Modeler Consultant

Responsibilities:

  • Responsible for analyzing functional requirements. Created design specifications and technical specifications based on the functional requirements.
  • Extensively worked on developing Informatica Mapplets, Mappings, Sessions, Worklets and Workflows for data loading.
  • Worked on transformations such as Lookup, Aggregator, Expression, Joiner, Filter, Rank, Sorter, Router, Sequence Generator, XML transformation etc.
  • Created initial-load mappings for all source systems, cleansed the data, and loaded it into the staging area.
  • Worked on the Web Service Consumer transformation to gather metrological data.
  • Extensively used ETL to load data from wide range of sources such as relational database, XML, flat files (fixed-width or delimited) etc.
  • Expertise in writing pre-SQL and post-SQL statements and SQL overrides as per the requirements.
  • Extensively worked with the Pushdown Optimization (PDO) and Change Data Capture (CDC) mechanisms.
  • Extensively used the capabilities of Power Center such as File List, pmcmd, Target Load Order, Constraint Based Loading, Concurrent Lookup Caches etc.
  • Responsibilities included designing and developing complex Informatica mappings, including Type-II slowly changing dimensions (see the SQL sketch after this list).
  • Identified, researched, and resolved the root causes of ETL production issues and system problems.
  • Worked with pre-session and post-session UNIX scripts for automation of ETL jobs using CONTROL-M schedulers, and was involved in migration and deployment of ETL code from the development to the production environment
  • Knowledge of Star schemas and Snowflake schemas.
  • Responsible for creating parameter files in order to connect to the right environment and database. Responsible for monitoring sessions and debugging mappings of failed sessions.
  • Handled production support for the monitoring of daily and monthly jobs; part of the implementation team.
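
A minimal SQL sketch of the Type-II logic that such mappings implement, shown here as plain DB2-style SQL purely for illustration; the table and column names (dw.customer_dim, stg.customer, address) are assumptions:

    -- 1) Expire the current dimension row when a tracked attribute has changed
    UPDATE dw.customer_dim d
    SET    current_flag = 'N',
           effective_end_dt = CURRENT DATE
    WHERE  current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg.customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address <> d.address);

    -- 2) Insert the changed record as the new current version
    INSERT INTO dw.customer_dim
           (customer_id, address, effective_start_dt, effective_end_dt, current_flag)
    SELECT s.customer_id, s.address, CURRENT DATE, DATE('9999-12-31'), 'Y'
    FROM   stg.customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dw.customer_dim d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y'
                       AND    d.address = s.address);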

Environment: Informatica Power Center 9.1, Confidential DB2, Control-M, PL/SQL, FileZilla, Windows, UNIX Shell Scripting

Confidential, Hillsboro, OR

Sr Informatica Consultant

Responsibilities:

  • Evaluated the current system to determine requirements and enhancements to serve project objectives
  • Interpret functional requirements, create technical design, implement the necessary technology to fulfill requirements
  • Design, develop, and maintain the delivery of data and information to satisfy business needs within assigned functional responsibilities.
  • Design, develop, and maintain data processes and infrastructure to efficiently source and integrate multiple internal and external data sources to increase the value of the data to the organization.
  • Design high level ETL process and data flow from source system to Data Warehouse and Involved in creating ER diagrams (Logical and Physical models using Erwin) and mapping the data to the database objects.
  • Review the existing and new physical database objects, which include fact tables, dimensions, staging tables, materialized views, and indexes provided by the Data Modeler
  • Design, develop, and implement complex Informatica mappings to extract data from various sources (Oracle/Teradata/flat file/XML) using different transformations and load it into various target (Oracle/CSV/flat-file) data warehouse environments.
  • Constructed and implemented database objects (DDL) and implemented queries (DML) that perform the business functions of the database models.
  • Design, develop, and implement ETL objects by extracting data from source systems into the Hadoop Distributed File System (HDFS) using Sqoop.
  • Design, develop, and implement Autosys jobs to trigger UNIX shell scripts for importing data from source systems and bringing it into HDFS through AWS S3 storage.
  • Design, develop, and implement Hive tables on top of the HDFS file system, with data stored in Avro/text format (see the sketch after this list).
  • Develop and implement Korn/Bash/Perl/Python shell scripts to automate batch activities and schedule automated jobs using Autosys.
  • Involved in Testing, Debugging, Data Validation and Performance Tuning of ETL process, help develop optimum solutions for data warehouse deliverables
  • Design, develop, and implement backend Oracle packages, procedures, and functions to accomplish database needs.
  • Built complex PL/SQL queries, stored procedures, packages, and triggers (a minimal package sketch follows this list).
  • Prepared detailed design documentation thoroughly for the production support department to use as a guide for future production runs before the code was migrated.
  • Prepared system documentation and provided assistance and training to team members by acting as a team advisor.
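
A minimal HiveQL sketch of the kind of external table described above, assuming a hypothetical landing path and column layout (an Avro-backed table would use STORED AS AVRO instead of TEXTFILE):

    -- External Hive table over delimited text files landed in HDFS by Sqoop
    CREATE EXTERNAL TABLE IF NOT EXISTS stg.orders_txt (
        order_id     BIGINT,
        customer_id  BIGINT,
        order_dt     STRING,
        order_amt    DECIMAL(12,2)
    )
    ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION '/data/landing/orders';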
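
And a minimal sketch of a backend PL/SQL package of the kind mentioned above; the package, table, and column names are assumptions:

    -- Hypothetical load package: moves one batch from staging into the fact table
    CREATE OR REPLACE PACKAGE dw_load_pkg AS
      PROCEDURE load_fact_sales(p_batch_id IN NUMBER);
    END dw_load_pkg;
    /

    CREATE OR REPLACE PACKAGE BODY dw_load_pkg AS
      PROCEDURE load_fact_sales(p_batch_id IN NUMBER) IS
      BEGIN
        INSERT INTO fact_sales (order_id, sale_dt, sale_amt, batch_id)
        SELECT s.order_id, s.sale_dt, s.sale_amt, p_batch_id
        FROM   stg_sales s
        WHERE  s.batch_id = p_batch_id;

        COMMIT;
      EXCEPTION
        WHEN OTHERS THEN
          ROLLBACK;
          RAISE;
      END load_fact_sales;
    END dw_load_pkg;
    /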

Environment: Unix, HP UX, Solaris, Korn Shell Script, Perl Scripts, Crontab, Autosys 11.0, Informatica PowerCenter 9.5, Informatica Cloud, Oracle PL/SQL, Teradata, ERWIN, Hadoop - Map Reduce, Hive, Impala, Pig, Sqoop, SQL Server, T-SQL

Confidential, Chicago, IL

Informatica Developer

Responsibilities:

  • Gather requirements from the business stakeholders and Blue Health Intelligence and Medical Intelligence application testers
  • Created high-level designs and architecture direction to be used as inputs to detailed designs.
  • Delivered ETL specifications and Detailed Design by developing Informatica mapping designs, Data Integration Workflows and load processes.
  • Develop and implement new jobs using Informatica workflows within the scheduling environment to support daily, monthly, quarterly, and yearly Business Intelligence needs
  • Created complex mappings using several transformations like Source Qualifier, Expression, Dynamic lookup, Aggregator, Joiner, Sequence generator, Router, Filter, Stored procedure, Sorter and update strategy transformations.
  • Involved in tuning SQL queries using Quest Central tools and manually via EXPLAIN PLAN.
  • Developed code using Perl scripts and UNIX Korn shell scripts that support the main functionality of the application
  • Design, Develop, Test and Implement Autosys jobs
  • Performed Integration, system and regression testing and reported the test results to the users with the Traceability matrix
  • Provided technical support to the testing team members and fixed issues as and when errors were reported
  • Provide technical solutions to the Queries or Issues of the Business Users
  • Implemented the project with high quality and within strict timelines
  • Apply knowledge of computer capabilities, subject matter, and symbolic logic, and prepare technical documentation to guide end users

Environment: Confidential AIX, Korn Shell Scripting, FTP, SFTP, Windows Batch, XML, HTML, CGI, Informatica 9.0, Autosys 11, Oracle 11g, PL/SQL, SQL Server, T-SQL, TFS

Confidential, Stamford, CT

BI/Data Warehouse Developer

Responsibilities:

  • Totally responsible for every phase of the software delivery life cycle, including requirements, analysis & design, code build, unit test, support for system and integration testing, the implementation phase, and post-implementation support
  • Works closely with Analytics and Marketing users to understand informational needs and business challenges, document those requirements, and translate into solutions
  • Designs, analyzes, develops, codes, tests, debugs and documents programming to satisfy business requirements
  • Responsible for ETL design, which entails analyzing and documenting source-to-target data mappings and developing custom ETL solutions using PL/SQL and materialized views for complex queries to load warehouse dimension, fact, and summary tables from the operational database
  • Developed mappings/Reusable Objects/Transformation/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center 8.6
  • Responsible for developing ETL packages to load dimension, fact, and summary tables, where some fact tables contain over 500 million rows and 168 partitions for data spanning over 14 years, using PL/SQL, materialized views, and materialized view logs. Fact and summary tables were incrementally refreshed using PL/SQL, materialized view logs, and partition exchange (see the sketch after this list).
  • Designed and developed mappings using Source Qualifier, Expression, connected-Lookup and unconnected-Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, joiner & Rank transformations.
  • Thoroughly perform unit testing ensuring minimal code defects out of build phase into system test phase
  • Provides the project manager/supervisor with timely and accurate information regarding the performance of the assigned project(s)
  • Work effectively in a demanding multi-project, multi-team environment
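
A minimal Oracle sketch of the incremental-refresh pattern described above; the table, partition, and column names are assumptions:

    -- Materialized view log so dependent summary MVs can be fast-refreshed incrementally
    CREATE MATERIALIZED VIEW LOG ON fact_sales
      WITH ROWID, SEQUENCE (sale_dt, sale_amt)
      INCLUDING NEW VALUES;

    -- Load a period into a staging table offline, then swap it into the fact table's partition
    ALTER TABLE fact_sales
      EXCHANGE PARTITION p_2008_q1 WITH TABLE stg_sales_q1
      INCLUDING INDEXES WITHOUT VALIDATION;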

Environment: Solaris, Korn, Perl Shell Scripting, Informatica Power Center 7.1,Power Exchange 7.1, Oracle (RAC) 10g, Confidential DB2, PL/SQL, SQL, SQL Loader, HTML, XML, Crontab, Autosys 4.5 and Autosys R11, Control M
