Informatica Developer Resume

Wilmington, DE

SUMMARY

  • 8+ years of experience in the analysis, design, development, testing, implementation, enhancement, and support of ETL applications, with strong experience in OLTP and OLAP environments as a Data Warehouse/Business Intelligence consultant.
  • 4+ years of experience in Talend Open Studio (6.x/5.x) for Data Integration, Data Quality and Big Data.
  • Experience working with data warehousing concepts such as OLAP, OLTP, Star Schema, Snowflake Schema, logical data modeling, physical modeling, and dimensional data modeling.
  • 2+ years of expertise with the Talend Data Integration and Big Data Integration suites for the design and development of ETL/Big Data code and mappings for enterprise DWH ETL Talend projects.
  • Broad experience using Talend features such as context variables, triggers, and connectors for databases and flat files.
  • Hands-on involvement with many of the palette components used to design jobs, and used context variables to parameterize Talend jobs.
  • Experienced with Talend Data Fabric ETL components, including context variables and the MySQL, Oracle, and Hive database components.
  • Tracked daily data loads and monthly data extracts and sent them to clients for verification.
  • Strong experience in designing and developing Business Intelligence solutions in Data Warehousing using ETL Tools.
  • Excellent understanding of data warehousing concepts and best practices; involved in the full development life cycle of data warehousing.
  • Experienced in analyzing, designing, and developing ETL strategies and processes, and in writing ETL specifications.
  • Involved in extracting users' data from various data sources into the Hadoop Distributed File System (HDFS).
  • Experience with the MapReduce programming model and with the installation and configuration of Hadoop, HBase, Hive, Pig, Sqoop, and Flume using Linux commands.
  • Experienced in using Talend Data Fabric tools (Talend DI, Talend MDM, Talend DQ, Talend Data Preparation, ESB, TAC)
  • Experienced in working with different data sources like Flat files, Spreadsheet files, log files and Databases.
  • Knowledge of Data Flow Diagrams, Process Models, and E-R diagrams with modeling tools like ERwin and ER Studio.
  • Experience in AWS S3, RDS (MySQL), and Redshift cluster configuration.
  • Extensive experience on the J2EE platform, developing both front-end and back-end applications using Java, Servlets, JSP, EJB, AJAX, Spring, Struts, Hibernate, JAXB, JMS, JDBC, and Web Services.
  • Strong understanding of data modeling (relational, dimensional, star and snowflake schema) and data analysis, with implementation of data warehouses on Windows and Unix.
  • Extensive experience in developing stored procedures, functions, views, triggers, and complex queries using SQL Server.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Worked in all phases of BW/BI full life cycles including Analysis, Design, Development, Testing, Deployment, Post-Production Support/Maintenance, Documentation and End-User Training.
  • Highly Proficient in Agile, Test Driven, Iterative, Scrum and Waterfall software development life cycle.
  • Highly motivated with the ability to work effectively in teams as well as independently.
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business people, and developers across multiple disciplines.
  • Ability to grasp and apply new concepts quickly and effectively.

TECHNICAL SKILLS

Data Warehousing: Talend Open Studio 6.2.1/6.1.1/5.5, Informatica Power Center 9.x/8.x/7.x

Databases: Teradata, Oracle 11g/10g/9i, MS SQL Server 2012/2008, Sybase, MS Access.

ETL Tools: Talend Enterprise Edition for Data Integration and Big Data.

Other Tools: ERwin, ER Studio, Visio, Toad, SQL*Plus, Data Studio, MS Office.

Data Modeling: Star Schema, Snowflake Schema.

Methodologies: Agile, Waterfall.

Languages: SQL, T-SQL, PL/SQL, C, JAVA, UNIX Shell Scripting.

Operating Systems: Unix (AIX, HP-UX), LINUX, Windows

PROFESSIONAL EXPERIENCE

Confidential - Mooresville, NC

Sr. Talend Developer

Responsibilities:

  • Developed custom components and multi-threaded configurations for flat files by writing Java code in Talend.
  • Deployed and scheduled Talend jobs in the Administration Console and monitored their execution.
  • Created separate branches within the Talend repository for development, production, and deployment.
  • Excellent knowledge of the Talend Administration Console, Talend installation, and the use of context and global map variables in Talend.
  • Reviewed requirements to help build valid and appropriate DQ rules, and implemented the DQ rules using Talend DI jobs.
  • Created cross-platform Talend DI jobs to read data from multiple sources such as Hive, HANA, Teradata, DB2, Oracle, and ActiveMQ.
  • Created Talend jobs for data comparison between tables across different databases, identifying and reporting discrepancies to the respective teams.
  • Performed Talend administrative tasks such as upgrades, creating and managing user profiles and projects, managing access, monitoring, and setting up TAC notifications.
  • Observed Talend job statistics in AMC to improve performance and identify the scenarios causing errors.
  • Created Generic and Repository schemas.
  • Created standard and best practices for Talend ETL components and jobs.
  • Extracted, transformed, and loaded data from various file formats such as .csv, .xls, .txt, and other delimited formats using Talend Open Studio.
  • Worked with HiveQL to retrieve data from the Hive database.
  • Responsible for developing a data pipeline on Amazon AWS to extract data from weblogs and store it in HDFS.
  • Executed Hive queries on Parquet tables to perform data analysis and meet the business requirements (a representative query sketch follows this list).
  • Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed.
  • Responsible to tune ETL mappings, Workflows and underlying data model to optimize load and query performance.
  • Configured Talend Administration Center (TAC) for scheduling and deployment.
  • Created and scheduled execution plans to build job flows.
  • Worked with production support in finalizing scheduling of workflows and database scripts using AutoSys.
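
A minimal sketch of the kind of HiveQL run against the Parquet-backed tables; the table and column names (weblogs_parquet, event_date, user_id) are hypothetical, not from the actual project:

  -- Daily traffic summary over a Parquet-backed Hive table (illustrative names).
  SELECT event_date,
         COUNT(*)                AS page_views,
         COUNT(DISTINCT user_id) AS unique_visitors
  FROM   weblogs_parquet          -- table created with STORED AS PARQUET
  WHERE  event_date BETWEEN '2016-01-01' AND '2016-01-31'
  GROUP BY event_date
  ORDER BY event_date;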

Environment: Talend 6.2.1/6.0.1, Talend Open Studio Big Data/DQ/DI, Talend Administrator Console, Oracle 11g, Teradata V 14.0, Hive, HANA, PL/SQL, DB2, XML, JAVA, ERwin 7, UNIX Shell Scripting.

Confidential - Chicago, IL

ETL/Talend Developer

Responsibilities:

  • Worked closely with Business Analysts to review the business specifications of the project and to gather the ETL requirements.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Created and managed Source to Target mapping documents for all Fact and Dimension tables.
  • Analyzed the source data to assess its quality using Talend Data Quality.
  • Involved in writing SQL queries and used joins to access data from Oracle and MySQL.
  • Assisted in migrating the existing data center into the AWS environment.
  • Used the most common Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput, and many more).
  • Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.
  • Experienced in using Talend's debug mode to debug jobs and fix errors.
  • Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using the Talend Integration Suite.
  • Conducted JAD sessions with business users and SMEs for a better understanding of the reporting requirements.
  • Developed Talend jobs to populate the claims data into the data warehouse (star schema).
  • Used the Talend Admin Console Job Conductor to schedule ETL jobs on a daily, weekly, monthly, and yearly basis.
  • Worked extensively with the Talend Admin Console and scheduled jobs in Job Conductor.

Environment: Talend Enterprise Big Data Edition 5.1, Talend Administrator Console, MS SQL Server 2012/2008, Oracle 11g, Hive, HDFS, Sqoop, TOAD, UNIX Enterprise Platform for Data Integration.

Confidential - San Jose, CA

Informatica / Talend Developer

Responsibilities:

  • Developed an ETL process to load Oracle data into the SQL Server system using the following Talend components:
  • Oracle components: tOracleConnection, tOracleInput, tOracleBulkExec.
  • Worked on Linux (Red Hat) systems to deploy the Talend code.
  • Deployed the code to other machines using shell scripts.
  • Worked extensively on SQL queries for validating the records.
  • Paginated the SQL statements in the ETL flow to handle memory issues and improve performance (see the pagination sketch after this list).
  • Handled the deadlock errors raised while updating the SQL Server tables in the ETL flow.
  • Parameterized the overall workflow to execute the code in different environments.
  • Parallelized the workflows to reduce execution time.
  • Developed ETL mappings.
  • Developed and tested all the backend programs, Informatica mappings and update processes.
  • Developed Informatica mappings to load data into various dimension and fact tables from various source systems.
  • Worked on Informatica Power Center Designer tools such as Source Analyzer, Target Designer, Transformation Developer, Mapping Designer, and Mapplet Designer.
  • Worked on Informatica Power Center Workflow Manager tools such as Task Designer, Workflow Designer, and Worklet Designer.
  • Worked as a key project resource taking day-to-day work direction and accepting accountability for technical aspects of development.
  • Developed business rules for cleansing, validating, and standardizing data using Informatica Data Quality.
  • Designed and developed multiple reusable cleanse components.
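
A minimal sketch of keyset pagination on the Oracle side, reading one bounded chunk per iteration so the job never holds the full result set in memory; the table, key column, and chunk size are hypothetical:

  -- Each pass fetches the next chunk keyed off the highest id already
  -- processed (illustrative names; Oracle 11g ROWNUM style).
  SELECT *
  FROM  (SELECT order_id, customer_id, amount
         FROM   src_orders
         WHERE  order_id > :last_order_id   -- highest key from the previous chunk
         ORDER BY order_id)
  WHERE ROWNUM <= 10000;                    -- chunk size tuned to available memory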

Environment: Talend Open Studio 5.0.1, Informatica Power Center, UNIX, SQL Server, TOAD, AutoSys.

Confidential, TX

ETL/Informatica Developer

Responsibilities:

  • Involved in the complete life cycle development of the project including Analysis, Design, Development, Testing and Production support.
  • Interacted with Business analysts to understand and convert Business Requirements to Technical Requirements.
  • Prepared High-level and Detailed Level Design Documents for Extractions, Validations, Transformations and Loading to target systems.
  • Implemented Slowly Changing Dimensions - Type I and Type II - in various mappings (an expire-and-insert SQL sketch follows this list).
  • Used Debugger within the Mapping Designer to test the data flow between source and target and to troubleshoot the invalid mappings.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.
  • Used Workflow Manager for creating, validating, testing, and running sequential and parallel sessions that perform full and incremental loads to the target system.
  • Migrated data across environments, such as development to production, by creating shortcuts and deployment groups.
  • Created jobs in the Tidal scheduler to automatically run Informatica workflows based on dependency checks.
  • Identified and created various test scenarios for Unit testing the data loaded in target.
  • Performed unit testing and integration testing for the objects created at the ETL level with testers, as per testing standards, and UAT with users to check data consistency.
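
A minimal sketch of the Type II pattern at the SQL level - expire the current row when tracked attributes change, then insert a new current version; all object names (dim_customer, stg_customer, dim_customer_seq) are hypothetical:

  -- Step 1: close out the current version of any customer whose tracked
  -- attributes changed in the staging load (illustrative names).
  UPDATE dim_customer d
  SET    d.eff_end_date = SYSDATE,
         d.current_flag = 'N'
  WHERE  d.current_flag = 'Y'
  AND    EXISTS (SELECT 1
                 FROM   stg_customer s
                 WHERE  s.customer_id = d.customer_id
                 AND   (s.address <> d.address OR s.status <> d.status));

  -- Step 2: insert a fresh current row for new and changed customers.
  INSERT INTO dim_customer (customer_key, customer_id, address, status,
                            eff_start_date, eff_end_date, current_flag)
  SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
         SYSDATE, DATE '9999-12-31', 'Y'
  FROM   stg_customer s
  WHERE  NOT EXISTS (SELECT 1
                     FROM   dim_customer d
                     WHERE  d.customer_id = s.customer_id
                     AND    d.current_flag = 'Y');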

Environment: Informatica Power Center 9.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor and Repository Server Admin Console), Power Exchange 9.1, Informatica Developer 9.1, Oracle 12c/11g, PL/SQL, SQL, TOAD, Red Hat LINUX 5.8.

Confidential - Wilmington, DE

Informatica Developer

Responsibilities:

  • Designed and developed a star schema model for the target database using ERwin data modeling.
  • Extensively used the Informatica ETL tool to extract data stored in MS SQL Server 2000, Excel, and flat files and load it into a single data warehouse.
  • Designed and developed Mapplets for faster development, standardization and reusability purposes.
  • Implemented Slowly Changing Dimension Type 1 and Type 2 for inserting and updating target tables to maintain history.
  • Used the Debugger to validate transformations by creating breakpoints to analyze and monitor data flow.
  • Worked along with the QA Team and provided production support by monitoring the processes running daily.
  • Involved in pre- and post-session migration planning for optimizing data load performance.
  • Performed Unit testing during the mapping phase to ensure proper and efficient implementation of the transformations.
  • Wrote UNIX shell scripts and used the pmcmd command-line utility to interact with the Informatica Server from command mode.

Environment: Informatica Power Center 8.x, Informatica Repository Manager, Oracle 10g/9i, DB2, ERwin, TOAD, Unix - AIX, PL/SQL, SQL Developer.

Confidential

SQL/BI Developer

Responsibilities:

  • Created database objects like views, indexes, user defined functions, triggers and stored procedures.
  • Involved in ETL process from development to testing and production environments.
  • Extracted data from various sources like flat files and Oracle and loaded it into target systems using Informatica 7.x.
  • Developed mappings using various transformations like update strategy, lookup, stored procedure, router, joiner, sequence generator and expression transformation.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys (a trigger sketch follows this list).
  • Used Informatica Power Center Workflow Manager to create sessions, batches to run with the logic embedded in the mappings.
  • Tuned mappings and SQL queries for better performance and efficiency.
  • Automated existing ETL operations using AutoSys.
  • Created and ran shell scripts in the UNIX environment.
  • Created tables and partitions in the Oracle database.
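
A minimal sketch of the sequence-plus-trigger pattern for automatic primary key assignment; the sequence, trigger, and table names are hypothetical:

  -- Sequence-backed BEFORE INSERT trigger that assigns the primary key
  -- when the inserting session did not supply one (illustrative names).
  CREATE SEQUENCE order_seq START WITH 1 INCREMENT BY 1;

  CREATE OR REPLACE TRIGGER trg_orders_pk
  BEFORE INSERT ON orders
  FOR EACH ROW
  WHEN (NEW.order_id IS NULL)
  BEGIN
    SELECT order_seq.NEXTVAL INTO :NEW.order_id FROM dual;
  END;
  /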

Environment: Informatica Power Center 8.x, Oracle, SQL developer, MS Access, PL/SQL, UNIX Shell Scripting, SQL Server 2005, Windows XP.

Confidential

Database Developer

Responsibilities:

  • Responsible for requirement analysis of the application.
  • Used the Informatica ETL tool to load strategic source data to build the data marts.
  • Created an operational data store and designed the metadata build-up for data mapping.
  • Involved in mass data loads, refreshing data in various applications, performance evaluations, and modifying existing code to accommodate new features.
  • Used various transformations like Aggregator, Router, Expression, Source Qualifier, Filter, Lookup, Joiner, Sorter, XML Source Qualifier, Stored Procedure, and Update Strategy.
  • Worked extensively on flat files, as the data from various legacy systems arrived as flat files.
  • Set up test and production environments for all mappings and sessions.
  • Created and configured Sessions in Workflow Manager and Server Manager.
  • Debugged the sessions using Debugger and monitored Workflows, Worklets and Tasks by Workflow Monitor.
  • Created and used mapping parameters and mapping variables in the Informatica Mapping Designer to simplify the mappings.
  • Developed Oracle stored procedures, packages, and triggers for data validations.
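
A minimal sketch of one such validation procedure, logging rows that fail a business rule to an error table; all names (validate_customer_load, stg_customer, load_errors) are hypothetical:

  -- Flags staging rows with a missing or malformed email and records them
  -- in an error table for review (illustrative names and rule).
  CREATE OR REPLACE PROCEDURE validate_customer_load AS
  BEGIN
    INSERT INTO load_errors (table_name, record_key, error_msg, logged_at)
    SELECT 'STG_CUSTOMER', s.customer_id, 'Missing or malformed email', SYSDATE
    FROM   stg_customer s
    WHERE  s.email IS NULL OR INSTR(s.email, '@') = 0;
    COMMIT;
  END validate_customer_load;
  /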

Environment: Informatica Power Center 8.6.1, Oracle 11g, SQL Server 2008, MS Access, Windows XP, Toad, SQL Developer.
