
Senior Informatica Developer Resume


Chicago, IL

SUMMARY

  • 8+ years of Information Technology experience in Informatica, Teradata, Oracle, PL/SQL, Talend Data Integrator 6.1 (DI), data analysis, design, and development for software applications in a client-server environment, providing Business Intelligence solutions in data warehousing for decision-support systems.
  • Expert at engaging business users in requirement analysis and defining business and functional specifications.
  • Experience in Erwin database modeling for data warehouse schemas; proficient in dimensional modeling, Star Schema, Snowflake Schema, and hierarchy modeling.
  • Interacted with data management teams on business requirements and problem solving.
  • Good knowledge of HDFS architecture and cluster concepts.
  • Loaded data from various data sources and legacy systems into Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad, and Informatica (a FastLoad sketch follows this list).
  • Experienced in a fast Agile Development Environment including Test-Driven Development (TDD) and Scrum.
  • Tivoli Workload Scheduler consultant with over 2 years’ experience supporting TWS/Maestro installations.
  • Extensive experience in using Microsoft BI tools SSIS, SSAS, and SSRS.
  • Experience in Data Warehouse development from inception through implementation and ongoing support; strong understanding of BI application design and development principles.
  • Experience in creating and understanding Data models, Data flows and Data dictionaries.
  • Good knowledge of the Talend interface with Hadoop, i.e., HDFS, Pig, Hive.
  • Integrated data from Oracle to Salesforce (SFDC) using Informatica Cloud.
  • Hands-on exposure to the Hadoop ecosystem (MapReduce, Pig, Hive, Sqoop, HBase, Cloudera Manager) alongside ETL and RDBMS work.
  • Optimized data transformation processes in Hadoop and Big Data environments.
  • Experience with Normalization and De-normalization processes.
  • Experience in performance testing ETL mappings.
  • Worked with technical and business user teams to validate ETL test cases.
  • Experience working on C# and .Net frontend applications.
  • Good technical skills in performance tuning, debugging, and troubleshooting within PowerCenter.
  • Experience in writing complex T-SQL queries, Sub-queries, Dynamic SQL queries etc.
  • Experience in scripting complex Stored Procedures for better Performance.
  • Experience with SQL Server constraints (Primary Key, Foreign Key, Unique Key, Check).
  • Experience in applying SQL Server Indexes on database tables for high performance and Query Optimization.
  • Technical expertise in unit testing, debugging, and troubleshooting with PowerCenter.
  • Experience with Data Migration, Data Formatting and Data Validations.
  • Experience with managing users and their authorizations.
  • Highly experienced in the whole cycle of DTS/SQL Server Integration Services (SSIS) packages (developing, deploying, scheduling, troubleshooting, and monitoring) for performing data transfers and ETL across different servers.
  • Experience in logging and error handling using event handlers and custom logging in SSIS.
  • Experience with system and user-defined variables and package and project configurations for SSIS packages.
  • Real time experience in Data Modeling, Star Schema, Snowflake Schema, Slowly Changing Dimensions, Fact tables, Dimension tables, Normal forms, OLAP and OLTP.
  • Experience in Informatica PowerCenter Tools - Repository Manager, Designer, Workflow Manager and Workflow Monitor and designing Mappings, Mapplets, Reusable Transformations, Transformations, Tasks, Worklets and Workflows.
  • Expertise in creating tables, triggers, macros, views, stored procedures, functions, and packages in the Teradata database.
  • Good understanding of Data warehousing, Dimension Modeling and RDBMS concepts.
  • Extracted source data from Oracle 11i, SQL Server, flat files, and COBOL sources.
  • Experience with Oracle 11g and SQL server 2012.
  • Extensive experience with Informatica PowerCenter 9.5/9.1/8.6.
  • Extensive experience with Informatica PowerExchange 8.6 and above.
  • Experience in writing Complex SQL queries, stored procedures and functions using PL/SQL Programming.
  • Good working experience using Spring Framework modules such as Spring Core Container, Spring Application Context, Spring Boot, Spring MVC, Groovy/Grails, Spring AOP, Spring ORM, and Spring Batch.
  • Proactively monitored and optimized query and session performance and fine-tuned mappings.
  • Experience in Collecting and transforming data from heterogeneous data sources like Transactional Databases, Flat files, MS Excel, XML, etc. into the central data warehouse.
  • Strong Quality Assurance, debugging and performance tuning skills in ETL Process and hands on experience in troubleshooting and handling production support jobs.
  • Worked through the complete Software Development Life Cycle (SDLC) from requirement gathering, analysis, data modeling, design, testing, debugging, and implementation to post-implementation support and maintenance.
  • Experience with Type 1, Type 2, Type 3 Dimensions.
  • Team player and self-starter with good communication skills and ability to work independently and as part of a team.
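
A minimal sketch of the kind of Teradata FastLoad step referenced above, wrapped in a UNIX shell script as such loads typically are. The TDPID, credentials, table, and file layout are illustrative assumptions, not actual production objects:

#!/bin/sh
# Illustrative FastLoad run: loads a pipe-delimited extract into an
# empty staging table (FastLoad requires an empty target plus two
# error tables). All object names here are hypothetical.
fastload <<'EOF'
SESSIONS 4;
ERRLIMIT 25;
LOGON tdprod/etl_user,etl_pass;

BEGIN LOADING stg_db.customer_stg
    ERRORFILES stg_db.customer_err1, stg_db.customer_err2;

SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(50)),
       load_dt   (VARCHAR(10))
FILE = /data/incoming/customer.dat;

INSERT INTO stg_db.customer_stg (cust_id, cust_name, load_dt)
VALUES (:cust_id, :cust_name, :load_dt);

END LOADING;
LOGOFF;
EOF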

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.5/9.1/8.6.0 (Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, Workflow Monitor), Ab Initio 3

BI Tools: Tableau 7/8, IBM Cognos

Databases: Oracle 11g/10g/9i, DB2 8.x, SQL Server 2000/2005, Teradata, Netezza

Languages: SQL, PL/SQL, Shell Scripting

Operating Systems: UNIX (SOLARIS, AIX), Linux, Windows 95/98/NT/2000/XP and DOS

DB Tools: SQL Plus, SQL Loader, Toad, Power Designer, Erwin, SSIS, SSRS, SSAS

Scheduling Tools: Autosys

Web Services: Axis, JAX-WS, JAX-RS, WSDL, SOAP, REST

Others: MS Office (MS Access, MS Excel, MS PowerPoint, MS Word, MS Visio, COSMOS).

PROFESSIONAL EXPERIENCE

Confidential, CHICAGO, IL

Senior Informatica Developer

Responsibilities:

  • Developed mappings for loading the staging tables from text files using different transformations.
  • Implemented process automation for recurring production support operations; advised on ways to streamline, harden, de-risk, and make more consistent the software delivery and production support processes, for example by automating serial number (SN) assignments for product distribution channels (PDC) in QA/Test environments.
  • Used Erwin for logical and physical database modeling of the warehouse; responsible for creating database schemas based on the logical models.
  • Implemented multi-tier application provisioning in OpenStack cloud, integrating it with Puppet.
  • Creating manifests and classes in Puppet for automation.
  • Provided risk assessment and recommendations for the Jenkins environment running jobs for e-commerce builds, Sauce Labs automated testing, and common production support tasks. This included configuration and plugin recommendations for issues such as auditing of configuration changes, system performance monitoring, security, and resource utilization.
  • Replicated the Jenkins build server to a test VM using Packer, Virtual Box, Vagrant, Chef, Perlbrew and Serverspec.
  • Created Chef Cookbooks to deploy new software and plugins as well as manage deployments to the production Jenkins server.
  • Experience with OpenStack databases such as MySQL, MariaDB, and NoSQL databases for configuring the components.
  • Automated provisioning of new servers with Ansible, ensuring each server adheres to its role and maintains the desired configuration state.
  • Provisioned, automated, and configured on-demand resources using Chef, Ansible, and Heat.
  • Hands-on experience with Vagrant and Test Kitchen for testing cookbooks and playbooks.
  • Participated in development of new testing technologies, implementing quality assurance and lab accreditation systems
  • Experienced with DVO and Informatica Data Quality (IDQ) tools for data analysis, data profiling, and data governance.
  • Implemented IDQ transformations for heavy cleanup and massaging of the staging tables to eliminate redundant data and validate attribute details.
  • Created data stores, project, jobs, and data flows using Data Integrator
  • Implemented OpenStack vendor distributions from Red Hat
  • Created designs for integrating OpenStack with Dell hardware/software and Netapp Flexpod
  • Familiar with OpenStack concepts of user facing availability zones and administrator facing host aggregates
  • Implemented Software-Defined-Storage by integrating Ceph and Gluster to OpenStack cloud
  • Enhanced Accenture Private Cloud blueprint by mapping OpenStack components to functional blocks
  • Implemented multi-tier application provisioning in OpenStack cloud, integrating it with Chef/Puppet
  • Integrated OpenStack (Grizzly) with OpenvSwitch to create Software-Defined-Networking tenant and service provider networks and routers
  • Implemented automated local user provisioning in instances created in OpenStack cloud through Puppet manifests.
  • Involved in building the ETL architecture for Source to Target mapping to generate the target files.
  • Developed Mappings, Mapplets, Transformations and Reusable transformations by using PowerCenter Designer
  • Hands-on experience with relational database management systems.
  • Developed several mappings in Informatica by using the transformations like Unconnected and Connected lookups, Source Qualifier, Expression, Router, Filter, Aggregator, Joiner, Update Strategy, Union, Sequence Generator, Rank, Sorter, Normalizer, Transaction Control etc.
  • Developed Workflows, Worklets and Tasks by using PowerCenter Workflow Designer.
  • Optimized Sources, Targets, Mappings, Transformations and Sessions to increase session performance
  • Extensively used mapping parameters, variables, parameter files, user defined functions in Informatica
  • Used stored procedures to create a standard Time dimension and to drop and create indexes before and after loading data into the targets (see the index-handling sketch after this list).
  • Responsible for error handling using session logs and reject files in the Workflow Monitor.
  • Created indexes on database tables (SQL) and tuned queries to improve performance. Performed unit testing and documented the results.
  • Worked on Data Conversion and Data Analysis and Data Warehousing to meet EDW requirements.
  • Scheduled and monitored automated weekly jobs in the UNIX environment (a pmcmd wrapper sketch follows this list).
  • Used parameters and variables to facilitate smooth transition between the development and production environments.
  • Unit testing of individual modules and their integration testing.
  • Debugged and sorted out the errors and problems encountered in the production environment.
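
A sketch of the kind of cron-driven wrapper behind the scheduled weekly jobs above. The domain, integration service, folder, workflow, and mail address are placeholders, not actual names:

#!/bin/sh
# Hypothetical wrapper for a scheduled PowerCenter workflow. With -wait,
# pmcmd blocks until the workflow completes and returns non-zero on failure.
LOG=/var/etl/logs/wf_weekly_load_$(date +%Y%m%d).log

pmcmd startworkflow -sv INT_SVC_PROD -d Domain_Prod \
      -u "$INFA_USER" -p "$INFA_PASS" \
      -f SALES_DW -wait wf_weekly_load >> "$LOG" 2>&1

if [ $? -ne 0 ]; then
    # Alert production support with the captured log
    mailx -s "wf_weekly_load FAILED" etl_support@example.com < "$LOG"
    exit 1
fi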
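
And an illustrative version of the pre/post-load index handling mentioned above, run through SQL*Plus. The connect string, table, and index names are assumptions:

#!/bin/sh
# Pre-load: drop the index so the bulk insert runs unhindered.
sqlplus -s etl_user/etl_pass@ORCL <<'EOF'
WHENEVER SQLERROR EXIT SQL.SQLCODE
DROP INDEX idx_fact_sales_cust;
EOF

# ... the Informatica session loads FACT_SALES here ...

# Post-load: recreate the index and refresh optimizer statistics.
sqlplus -s etl_user/etl_pass@ORCL <<'EOF'
WHENEVER SQLERROR EXIT SQL.SQLCODE
CREATE INDEX idx_fact_sales_cust ON fact_sales (customer_key);
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'FACT_SALES')
EOF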

Environment: Informatica PowerCenter 9.5.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Erwin, Control-M, SQL Server, TOAD, Oracle 11g, Data Integrator, SQL Server 2008, Microsoft COSMOS, IBM Data Storage, Scope Studio, MS SQL Server 2012, BIDS (SSIS, SSRS, SSAS), Visual Studio 2013

Confidential, Chicago

Teradata/ETL Developer

Responsibilities:

  • Track and communicate team velocity and sprint/release progress
  • Worked with DBA for distribution key, random distribution and Organization Key on tables.
  • Creating an External Table from a file.
  • Involved in Performance/Query tuning. Generation/interpretation of explain plans and tuning SQL to improve performance.
  • Involved in writing UNIX shell scripts to run and schedule batch jobs.
  • Designed the Informatica mappings based on the Ab Initio code.
  • Fixed existing components by comparing the Informatica code with the Ab Initio graphs.
  • Created Pentaho ELT jobs with performance monitoring and logging.
  • Created aggregations in data models, defining the highest-level objects for dimensional modeling.
  • Worked in Agile project management environment.
  • Followed user inputs for agile development.
  • Experience in Big Data with a deep understanding of the Hadoop Distributed File System (HDFS) ecosystem.
  • Integrated data from Oracle to Salesforce (SFDC) using Informatica Cloud.
  • Optimized data transformation processes in Hadoop and Big Data environments.
  • Extensively worked on Connected & Unconnected Lookups, Router, Expressions, Source Qualifier, Aggregator, Filter, Sequence Generator, etc.
  • Experienced with DVO and Informatica Data Quality (IDQ) tools for data analysis, data profiling, and data governance.
  • Implemented IDQ transformations for heavy cleanup and massaging of the staging tables to eliminate redundant data and validate attribute details.
  • Created data stores, project, jobs, and data flows using Data Integrator
  • Created and maintained surrogate keys on the master tables to handle SCD Type 2 changes effectively (see the BTEQ sketch after this list).
  • Mentored Informatica developers on the project in development, implementation, performance tuning of mappings, and code reviews.
  • Experience in Informatica B2B Data Exchange with unstructured and structured data sets.
  • Used unstructured data such as PDF files, spreadsheets, Word documents, legacy formats, and print streams to produce normalized data with Informatica B2B Data Exchange.
  • Used SQL tools like TOAD to run SQL queries and validate the data in warehouse and mart.
  • Converted Oracle DDL to Netezza DDL.
  • Created the format of the unit test documents per Netezza Framework.
  • Expert in designing and scheduling complex SSIS packages for transferring data from multiple data sources to SQL Server.
  • Created metadata layer (reporting) for OBIEE and OBIA.
  • Modified OBIA dashboards for HR, INV, MFG, Financials, Spend Analysis, and Supply Chain.
  • Worked on designing catalogs, categories, sub-category and user roles using Kalido MDM 9.
  • Extensively used Teradata utilities like FastLoad and MultiLoad to load data into targets.
  • Performed bulk loading of Teradata tables using the TPump utility.
  • Validated the target data using SQL Assistant for Teradata.
  • Developed Informatica mappings/mapplets, sessions, and workflows for data loads, and automated the loads using UNIX shell scripts.
  • Used various lookup caches like Static, Dynamic, Persistent and Non-Persistent in Lookup transformation.
  • Involved in debugging mappings, recovering sessions and developing error-handling methods.
  • Successfully migrated objects to the production environment while providing both technical and functional support.
  • Developed E-MAIL tasks to send mails to production support and operations.
  • Involved in unit testing and documentation of the ETL process.
  • Extensive Tableau experience in an enterprise environment, including technical support, troubleshooting, report design, and monitoring of system usage.
  • Extensively used ETL to load data from Flat files, DB2 and Oracle into Teradata.
  • Used Informatica Designer to Extract & Transform the data from various source systems by incorporating various business rules. Also used different transformations, sessions and command tasks.
  • Created mappings using different transformations like Aggregator, Expression, Stored Procedure, Filter, Joiner, Lookup, Router and Update Strategy.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Used Informatica partitioning to improve data loading performance.
  • Developed shell scripts for job automation that generate a log file for every job.
  • Created an Informatica framework for audit and error balancing.
  • Used IBM Cognos for reporting.
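
A hedged BTEQ sketch of the SCD Type 2 handling on the master tables referenced above. The dimension, staging table, and columns are hypothetical, and the surrogate key is assumed to be generated as a Teradata IDENTITY column:

#!/bin/sh
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pass;

/* Close out the current row for any customer whose attributes changed */
UPDATE dw.customer_dim
SET    end_dt = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  current_flag = 'Y'
AND    cust_id IN (SELECT cust_id FROM stg.customer_delta);

/* Insert the new version; cust_sk is assumed to be an IDENTITY column */
INSERT INTO dw.customer_dim
       (cust_id, cust_name, start_dt, end_dt, current_flag)
SELECT s.cust_id, s.cust_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg.customer_delta s;

.LOGOFF;
EOF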

Environment: Informatica PowerCenter 9.x, Netezza, IDQ, SFDC, SSIS, SSAS, Hadoop, Microsoft SQL Server, Aginity Workbench, DB2, Flat Files.

Confidential, Mather, CA

Sr. Informatica Developer

Responsibilities:

  • Analyzing the source data coming from different sources and working with business users and developers to develop the Model.
  • Extracted, Transformed and Loaded OLTP data into the Staging area and Data Warehouse using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup (Connected & Unconnected), Source Qualifier, Filter, Update Strategy, Stored Procedure, Router, and Expression).
  • Developed number of complex Informatica Mappings, Mapplets and Reusable Transformations for different types of tests in Customer information, Monthly and Yearly Loading of Data.
  • Using Workflow Manager for Workflow and Session Management, database connection management and Scheduling of jobs to be run in the batch process.
  • Extracted data from various sources like Flat Files, SQL server and Oracle.
  • Extensively Used Environment SQL commands in workflows prior to extracting the data in the ETL tool.
  • Created Sessions, reusable Worklets and Batches in Workflow Manager and Scheduled the batches and sessions at specified frequency.
  • Monitored the sessions using Workflow Monitor.
  • Data Quality Analysis to determine cleansing requirements.
  • Used stored procedures to create a standard Time dimension and to drop and create indexes before and after loading data into the targets (an illustrative procedure follows this list).
  • Removed bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
  • Captured error records, corrected them, and loaded them into the target system.
  • Created Mappings, Mapplets and Transformations, which remove any duplicate records in source.
  • Implemented efficient and effective performance tuning procedures and performed benchmarking; these sessions set the baseline against which improvements were measured.
  • Tuned the source and target systems based on performance details; once source and target were optimized, sessions were rerun to determine the impact of the changes.
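
An illustrative version of the Time dimension procedure mentioned above, run through SQL*Plus. The table, columns, and date range are assumptions, not the actual objects:

#!/bin/sh
sqlplus -s etl_user/etl_pass@ORCL <<'EOF'
CREATE OR REPLACE PROCEDURE load_time_dim (p_start DATE, p_end DATE) AS
BEGIN
    DELETE FROM time_dim;  -- full rebuild of the dimension
    FOR i IN 0 .. (p_end - p_start) LOOP
        INSERT INTO time_dim (date_key, cal_date, cal_year, cal_month, day_name)
        VALUES (TO_NUMBER(TO_CHAR(p_start + i, 'YYYYMMDD')),
                p_start + i,
                EXTRACT(YEAR FROM p_start + i),
                EXTRACT(MONTH FROM p_start + i),
                TO_CHAR(p_start + i, 'DY'));
    END LOOP;
    COMMIT;
END;
/
EXEC load_time_dim(DATE '2000-01-01', DATE '2020-12-31')
EOF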

Environment: Informatica Power Center 8.5, MS SQL Server 2008, DB2, Oracle 10g, Teradata 13, Unix Shell Scripts, Toad.

Confidential, Bridgewater, NJ

Sr. Informatica Lead Developer

Responsibilities:

  • Co-ordinated Joint Application Development (JAD) sessions with Business Analysts and source developer for performing data analysis and gathering business requirements.
  • Developed technical specifications of the ETL process flow.
  • Designed the Source - Target mappings and involved in designing the Selection Criteria document.
  • Worked on design and development of Informatica mappings, workflows to load data into staging area, data warehouse and data marts in Teradata.
  • Used Informatica PowerCenter to create mappings, sessions and workflows for populating the data into dimension, fact, and lookup tables simultaneously from different source systems (SQL server, Oracle, Flat files).
  • Created mappings using various Transformations like Source Qualifier, Aggregator, Expression, Filter, Router, Joiner, Stored Procedure, Lookup, Update Strategy, Sequence Generator and Normalizer.
  • Deployed reusable transformation objects such as Mapplets to avoid duplication of metadata, reducing the development time.
  • Implemented an Informatica framework for dynamic parameter file generation; start, failure, and success emails for an integration; error handling; and operational metadata logging (a parameter-file sketch follows this list).
  • Implemented sending of Post-Session Email once data is loaded.
  • Worked with DBA for partitioning and creating indexes on tables used in source qualifier queries.
  • Involved in Performance/Query tuning. Generation/interpretation of explain plans and tuning SQL to improve performance.
  • Scheduled various daily and monthly ETL loads using Autosys.
  • Involved in writing UNIX shell scripts to run and schedule batch jobs.
  • Involved in unit testing and documentation of the ETL process.
  • Involved in Production Support in resolving issues and bugs.
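
A sketch of the dynamic parameter file generation in the framework above. The folder, workflow, session, and parameter names are illustrative only:

#!/bin/sh
# Generates a PowerCenter parameter file for today's run. The header
# [Folder.WF:workflow.ST:session] scopes the parameters to one session.
PARAM_FILE=/infa/params/wf_load_sales.param
LOAD_DATE=$(date +%Y-%m-%d)

cat > "$PARAM_FILE" <<EOF
[SALES_DW.WF:wf_load_sales.ST:s_m_load_sales]
\$\$LoadDate=$LOAD_DATE
\$DBConnection_Src=ORA_SRC_PROD
\$DBConnection_Tgt=TD_DW_PROD
EOF

# The generated file is then passed to the workflow via pmcmd -paramfile.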

Environment: Informatica PowerCenter 9.x, Teradata, PL/SQL, Oracle 11g, Toad 8.0, UNIX Shell Scripting, Tableau.

Confidential

ETL Developer

Responsibilities:

  • Analyzed source systems, business requirements and identified business rules for building the data warehouse.
  • Developed Technical Specifications of the ETL process flow
  • Designed and developed Informatica mappings, workflows to load data into Oracle ODS.
  • Installed and configured Informatica PowerCenter client tools and connected to each database in the data warehouse using the repository server.
  • Extensively used Informatica to load data from Oracle, XML and Flat Files to Oracle.
  • Used Informatica workflow manager, monitor, and repository manager to execute and monitor workflows and assign user privileges.
  • Extensively worked with Aggregator, Sorter, Router, Filter, Join, Expression, Lookup and Update Strategy, Sequence generator transformations.
  • Set up a metadata-driven utility design for the ETL processes using Informatica.
  • Used debugger to test the mapping and fixed the bugs.
  • Involved in tuning the performance of sessions and mappings.
  • Used the Workflow Manager to create workflows and tasks, and created worklets.
  • Involved in Production Support in resolving issues and bugs.
  • Worked on SQL stored procedures, functions and packages in Oracle.
  • Scheduled and executed batch and session jobs on Autosys.
  • Created and maintained UNIX shell scripts for pre/post-session operations and various day-to-day operations (an example follows this list).
  • Developed unit and system test cases, using System Procedures to check data consistency with adherence to the data model defined.
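
An example of the kind of pre/post-session housekeeping script referred to above. The directories and file name are placeholders, and in practice the two halves would run as separate pre- and post-session command tasks:

#!/bin/sh
SRC_DIR=/data/landing
ARCH_DIR=/data/archive/$(date +%Y%m%d)

# Pre-session: fail fast if the expected source file has not arrived
[ -s "$SRC_DIR/orders.dat" ] || { echo "orders.dat missing or empty" >&2; exit 1; }

# Post-session: archive the processed file so reruns start clean
mkdir -p "$ARCH_DIR"
mv "$SRC_DIR/orders.dat" "$ARCH_DIR/orders.dat"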

Environment: Informatica PowerCenter 9.x, Oracle 8i, PL/SQL, SQL

Confidential

ETL Developer

Responsibilities:

  • Responsible for design, development, enhancement of Informatica mappings.
  • Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, transformations.
  • Extensively used Informatica Server Manager to create workflows and batches, and scheduled the mappings based on user requirements.
  • Creation of Transformations like Lookup and Source Qualifier Transformations in the Informatica Designer.
  • Created various other transformations like Aggregator, Expression, Filter, Update Strategy, Stored Procedure, and Router, and fine-tuned the mappings for optimal performance.
  • Involved in creation of Stored Procedures and tested in development and staging environment
  • Tuning Informatica Mappings and Sessions for optimum performance.

Environment: Informatica PowerCenter 8.6, Oracle, SQL, PL/SQL, and Windows XP
