
Informatica Lead/developer Resume


Cincinnati, Ohio

SUMMARY:

  • Overall 10 years of experience in System Analysis, Design and Development in the fields of data warehousing and business intelligence, with a strong focus on Informatica PowerCenter ETL development, testing and bug fixing, SQL, PL/SQL, job scheduling and performance tuning, Big Data, Data Analytics, Data Warehousing, Data Integration and Data Migration.
  • 7+ years of end-to-end experience in Informatica ETL, Data Modeling, Informatica MDM, Oracle and UNIX environments.
  • 2 years of experience in Hadoop Big Data and Data Analytics environments, including design and development.
  • 6 months of development experience in a Java environment.
  • Employed ETL methodology to support data extraction, transformation and loading using Informatica PowerCenter.
  • Experience in the full cycle of software development, including project planning, requirements analysis and documentation, design, development, testing and implementation, integration and deployment, in domains such as Supply Chain, Pharma, Healthcare, Telecom and Finance.
  • Fair experience in Data Modeling and Dimensional Modeling of large databases.
  • Excellent experience in requirements gathering and in preparing High Level Design and Low Level Design documents.
  • Excellent experience in developing complex mappings from different types of sources and loading the data into different target formats.
  • Fair experience in testing Informatica code before promoting it across environments (Dev/QA/Production).
  • Worked extensively with complex mappings using transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Union, Lookup, Stored Procedure, Aggregator, Update Strategy and Normalizer.
  • Experience in creating Mapplets, Pushdown Optimization and reusable transformations.
  • Excellent experience in identifying performance bottlenecks and resolving them.
  • Excellent experience in performance tuning of source, target, mapping, session and system-resource bottlenecks.
  • Expertise in writing, testing and implementing Triggers, Stored Procedures, Functions and Packages at the database level using PL/SQL.
  • Expertise in the migration process across Dev/QA/UAT/Prod environments and in developing shell scripts to run/stop/monitor workflows and trap errors.
  • Experience in working with Informatica MDM, Business Objects, and Hyperion Essbase.
  • Proficient in OLTP, OLAP and Data Warehousing Design concepts.
  • Excellent experience in using Hadoop ecosystem components such as HDFS, MapReduce, Hive, Pig, HBase, YARN, Sqoop, Flume, Oozie, ZooKeeper, Kafka, AWS and Spark.
  • Fair experience in pulling structured and unstructured data from different sources, loading it into HDFS using Sqoop and Flume, and processing it with tools such as MapReduce, Hive and Pig.
  • Experience in creating Hive internal/external tables and views using a shared metastore, writing scripts in HiveQL, and performing data transformation and file processing using HQL scripts.
  • Experienced in creating Oozie workflows and coordinators for data ingestion and downstream processing.
  • Capable of processing large sets of structured, semi-structured and unstructured data and supporting systems application architecture.
  • Experience in setting up Single Node and Multi Node Hadoop clusters efficiently.
  • Hands on experience in installing, configuring Hadoop and using eco-system components HDFS, HBase, Pig, Flume, Hive, Sqoop and Map-Reduce.
  • Fair working knowledge of the Core Java environment.
  • Implemented business logic by writing UDFs in Java.
  • Experienced in creating Java functions (UDFs) and registering them in Hive databases for use in Hive HQL queries; a minimal sketch follows this summary.
  • Good knowledge of Cassandra and MongoDB programming and setup.
  • Strong experience in creating statistical models and performing data analysis using R programming.
  • In-depth knowledge of data-analytics algorithms such as top-N, sentiment analysis, naive Bayes and k-means clustering.
  • In-depth knowledge of building machine-learning models such as linear regression, logistic regression, decision trees, regularization and neural networks for predictive analysis.
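
For illustration, here is a minimal sketch of the kind of Java UDF registered in Hive described above. It assumes only the classic org.apache.hadoop.hive.ql.exec.UDF API from hive-exec; the class name MaskLastFour and the masking logic are hypothetical examples, not code from any project below.

  // Hypothetical Hive UDF: masks all but the last four characters of a string.
  import org.apache.hadoop.hive.ql.exec.UDF;
  import org.apache.hadoop.io.Text;

  public class MaskLastFour extends UDF {
      // Hive locates this evaluate() method by reflection at query time.
      public Text evaluate(Text input) {
          if (input == null) {
              return null;
          }
          String s = input.toString();
          if (s.length() <= 4) {
              return input;
          }
          String masked = s.substring(0, s.length() - 4).replaceAll(".", "*")
                  + s.substring(s.length() - 4);
          return new Text(masked);
      }
  }

Once compiled into a jar, such a function would typically be registered in Hive with ADD JAR followed by CREATE TEMPORARY FUNCTION mask_last_four AS 'MaskLastFour', and then called from HiveQL like any built-in function.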

TECHNICAL SKILLS:

ETL Tools: Informatica 9.1, Informatica MDM

Programming Languages: C, C++, Unix Shell Script, Java, Python

Hadoop Eco Systems: HDFS, YARN, MapReduce, Hive, HBase, Spark, Kafka, Flume, Avro, Sqoop, Pig, Oozie, ZooKeeper, R, Mahout, Cassandra, MongoDB

Databases: MS SQL Server 2005/2008/2012, Oracle 9i/10g, Microsoft Access, Netezza

OLAP Tools: Business Objects

Reporting Tools: Hyperion Essbase

Database Tools: Hadoop, Pig, Hive, Oracle, MySQL, MongoDB, Cassandra

Operating Systems: Windows 98, Windows 2000, Windows Server 2003, Windows XP, Linux, Unix

Core Skills: Requirements gathering, Design and redesign techniques, Database Modeling, Development, Testing and Performance tuning

PROFESSIONAL EXPERIENCE:

Confidential, Cincinnati, Ohio

Informatica Lead/Developer

Responsibilities:

  • Analyzing, coding, testing, and assisting with user acceptance testing, production implementation and system support for the Enterprise Data Warehouse Application.
  • Responsible for full life cycle development including gathering requirements.
  • Participated in interviewing users, gathering and documenting requirements, and determining project scope; analyzed user requirements to create system designs, coded all or shared in the coding and development, and carried out unit testing, system testing, final implementation and post-implementation monitoring.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, SQL, Union, Lookup, Joiner, XML Source Qualifier and Unconnected Lookup transformations.
  • Involved in Debugging and Troubleshooting Informatica mappings.
  • Populated error tables as part of the ETL process to capture the records that failed the migration.
  • Used Informatica PowerCenter 9.6.0/9.1.1/8.6.2 for extraction, transformation and loading (ETL) of data into the data warehouse.
  • Communicating with Onshore team for the issue clarification, status updates etc.
  • Implemented various Data Transformations for Slowly Changing Dimensions.
  • Worked on changes to the Informatica mappings and tested them to analyze and fix gaps in daily jobs related to the ERP project.
  • Responded to and resolved ad hoc requests from application and business users. Created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.

Environment: Informatica Power Center 9.1.0, OBIEE 10.1.3.4, SQL Server 2012, MS Excel, Windows XP/7, Windows Server 2008/2012

Confidential, Cincinnati, Ohio

Hadoop Data Analyst/Developer

Responsibilities:

  • Design and migration of existing RAN MSBI system to Hadoop.
  • Designed workflows and coordinators in Oozie to automate and parallelize Hive, Shell, Java and Pig jobs on an Apache Hadoop environment from Hortonworks (HDP 2.2).
  • Designed the control/job tables in HBase and MySQL. Created external Hive tables on HBase.
  • Developed a batch-processing framework to ingest data into HDFS, Hive and HBase.
  • Worked on Hive and Pig extensively to analyze network data.
  • Automated data pulls from SQL Server into the Hadoop ecosystem via Sqoop.
  • Tuned Hive and Pig job performance parameters, along with native parameters, to avoid excessive disk spills and enable temp-file compression between jobs in the data pipeline, handling production-size data in a multi-tenant cluster environment (Ambari views).
  • Hands-on writing complex Hive queries against external Hive tables dynamically partitioned on date, which store a rolling-window history of user viewing activity.
  • Experience in performance tuning of Hive scripts, Pig scripts and MR jobs in the production environment by altering job parameters.
  • Experience in working with Hortonworks on real-time issues and bringing them to closure.
  • Used Apache Kafka for importing real-time network log data into HDFS; a minimal producer sketch follows this list.
  • Deployed and configured Flume agents to stream log events into HDFS for analysis.
  • Worked on Hive UDFs to implement custom Hive and Pig capabilities in Java.
  • Developed shell scripts to automate routine tasks.
  • Loaded data into Hive tables using HiveQL, with de-duplication and windowing.
  • Generated ad-hoc reports using Hive to validate customer viewing history and debug issues in production.
  • Worked on HCatalog, which allows Pig and MapReduce to take advantage of the SerDe data-format definitions written for Hive.
  • Performed statistical data modeling, machine learning and sentiment analysis using R and Hadoop.
  • Installed and configured various components of the Hadoop ecosystem and maintained their integrity.
  • Designed, configured and managed the backup and disaster recovery for HDFS data.
  • Worked with systems engineering team to plan and deploy new Hadoop environments and expand Hadoop cluster.
  • Worked with BI teams in generating the reports in Tableau.
  • Worked with Java development teams in the data coordination.
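
The Kafka usage above can be illustrated with a minimal Java producer sketch. The broker address, the topic name network-logs and the sample record are hypothetical placeholders; only the standard org.apache.kafka.clients.producer API is assumed.

  import java.util.Properties;
  import org.apache.kafka.clients.producer.KafkaProducer;
  import org.apache.kafka.clients.producer.Producer;
  import org.apache.kafka.clients.producer.ProducerRecord;

  public class NetworkLogProducer {
      public static void main(String[] args) {
          Properties props = new Properties();
          props.put("bootstrap.servers", "broker1:9092");  // hypothetical broker
          props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
          props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");

          try (Producer<String, String> producer = new KafkaProducer<>(props)) {
              // Each network log line becomes one record, keyed by host name.
              producer.send(new ProducerRecord<>("network-logs", "host-01",
                      "2015-06-01T12:00:00Z host-01 eth0 rx_bytes=12345"));
          }
      }
  }

A downstream consumer, such as the Flume agents mentioned above, would then land the events in HDFS for Hive processing.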

Environment: HDP 2.1, Hadoop, Map-Reduce, Hive, HBase, Oozie, Sqoop, Kafka, Spark, Flume, Pig, Java

Confidential

Java Developer

Responsibilities:

  • Prepared HLDs and LLDs based on the requirements.
  • Served as a developer on this project.
  • Coded using the Struts framework, JSP and Java Servlets; a minimal servlet sketch follows this list.
  • Attended client meetings and reviewed test cases.
  • Prepared LLDs and fixed bugs during the ST and UAT stages.
  • Involved in resolving the Production Issues.
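
As an illustration of the servlet work above, here is a minimal sketch using the standard javax.servlet API of that era; the class name and response text are hypothetical.

  import java.io.IOException;
  import java.io.PrintWriter;
  import javax.servlet.ServletException;
  import javax.servlet.http.HttpServlet;
  import javax.servlet.http.HttpServletRequest;
  import javax.servlet.http.HttpServletResponse;

  public class StatusServlet extends HttpServlet {
      @Override
      protected void doGet(HttpServletRequest req, HttpServletResponse resp)
              throws ServletException, IOException {
          // Return a simple HTML page echoing the requested URI.
          resp.setContentType("text/html");
          PrintWriter out = resp.getWriter();
          out.println("<html><body>Request received at "
                  + req.getRequestURI() + "</body></html>");
      }
  }

In a Struts application, most requests would instead be routed through the Struts ActionServlet to Action classes, with JSPs rendering the views.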

Environment: Java, Servlets, JSP, Struts, Hibernate, SQL Server 2008, JBoss

Confidential

Informatica MDM Tester

Responsibilities:

  • Involved in Extraction, Transformation and Loading of data.
  • Involved in requirement gathering.
  • Involved in designing HLD and LLD for 4 milestones.
  • Gained working knowledge of developing the mappings.
  • Completed end-to-end testing for the entire engagement without any escalations.
  • Received appreciation from the business for on-time deliverables.
  • Involved in developing Python scripts.

Environment: Informatica MDM 9.1, Oracle 10g, Python 3.1

Confidential

ETL Developer

Responsibilities:

  • Worked with the business users to get the business rule modifications in development and testing phases and analyzed claims data through rigorous evaluation methodology.
  • Wrote PL/SQL stored procedures and triggers to implement business rules and transformations; a minimal sketch of invoking such a procedure from Java appears after this list.
  • Worked with the Informatica tools Source Analyzer, Warehouse Designer, Mapping Designer, Transformations, Repository Manager and Server Manager.
  • Developed ETL mappings to extract data from Oracle and load it into the Netezza database.
  • Contributed to performance tuning of the mappings/sessions/workflows.
  • Communicated with the onshore team for issue clarification, status updates, etc.
  • Handling complex Informatica issues faced by team.
  • Conducting and participating in the Peer review of the project deliverables.
  • Put in sincere effort to meet deadlines, communicated promptly about any issues or roadblocks, and helped/mentored new joiners in the project on both the technical and functional fronts.
  • Worked on Data Cleansing Processes.
  • Documented the Informatica mappings, workflow processes and error handling of ETL procedures.
  • Extensively worked on Performance Tuning using various components like parameter files, variables and dynamic cache.
  • Used Business Objects for reporting. Interacted with Users for analyzing various Reports.
  • Verified the logs to confirm that all relevant jobs completed successfully and on time, and was involved in production support to resolve production issues.
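
As a hedged illustration of the PL/SQL work above, here is a minimal Java JDBC sketch of invoking such a stored procedure; the connection URL, credentials and the procedure name APPLY_CLAIM_RULES are hypothetical placeholders, not objects from the actual project.

  import java.sql.CallableStatement;
  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.SQLException;

  public class ClaimRuleRunner {
      public static void main(String[] args) throws SQLException {
          String url = "jdbc:oracle:thin:@dbhost:1521:ORCL";  // hypothetical
          try (Connection conn = DriverManager.getConnection(url, "etl_user", "secret");
               CallableStatement call = conn.prepareCall("{call APPLY_CLAIM_RULES(?)}")) {
              call.setInt(1, 2010);  // hypothetical batch parameter
              call.execute();        // runs the business-rule procedure
          }
      }
  }

Within an Informatica flow, such procedures are typically invoked through the Stored Procedure transformation; the Java sketch just shows the call pattern.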

Environment: Informatica PowerCenter 8.1/8.6, Microsoft SQL Server 7, SQL, Oracle 10g

Confidential

Informatica Developer

Responsibilities:

  • Did extensive analysis on the business requirements and gathered information for the development of several small applications.
  • Developed ETL mappings to extract data from Oracle and load it into the Netezza database.
  • Performed error validation of the data moving from Oracle to the Netezza database.
  • Tested the mappings and checked the quality of the deliverables.
  • Communicated with the onshore team for issue clarification, status updates, etc.
  • Conducting and participating in the Peer review of the project deliverables.
  • Bug Analysis and fixing.
  • Contributed to Performance Tuning of the mappings/sessions/workflows.

Environment: Informatica Power Center 7.6, SQL Server 2005, Oracle 9i, TOAD, SQL*Plus and Windows XP

Confidential

Informatica Developer

Responsibilities:

  • Developing Mapping/Mapplets/Workflows involving various ETL processes using Informatica tools for Concurrent workflows project.
  • Configuring concurrent workflows for all data marts.
  • Bug Analysis and Bug fixing when running the concurrent workflows.
  • Communicating with Onshore team for the issue clarification, status updates etc.
  • Conducting and participating in peer code reviews of the project deliverables.
  • Contributed to Performance Tuning of the mappings/sessions/workflows.

Environment: Informatica 8.1, Oracle, UNIX

Confidential

Production Support Engineer

Responsibilities:

  • Daily loads and handling production failure issues for Alizes application.
  • Checking for the Data Quality on a daily basis.
  • Participating in the daily onsite/offshore coordination call.
  • Production Monitoring and Support.

Environment: Informatica 8.1, Oracle, UNIX
