- Over 9 years of IT experience in the Analysis, Design, Development, Testing and Implementation of business application systems for Banking, Financial and Manufacturing Sectors.
- Around 2 years of experience with Talend Open Studio and Talend Enterprise Platform for Data Management (v5.5.1).
- Strong analytical and conceptual knowledge in database design and development using Oracle 11g/10g/9i/8.x, SQL, Postgres, PL/SQL, Developer/2000 and UNIX Shell Scripting.
- Around 1 year of experience with the Talend Administration Center (TAC).
- Strong experience with the Informatica tools Repository Manager, Designer, Workflow Manager, and Workflow Monitor. Able to build complex mappings using transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, and Sequence Generator.
- Well versed in Informatica PowerCenter 9.x/8.x/7.x Designer tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer), Workflow Manager tools (Task Developer, Worklet Designer, and Workflow Designer), Repository Manager, and the Admin Console.
- Excellent background in implementing business applications and in applying RDBMS and OOP concepts.
- In-depth knowledge of data analysis, data warehousing, ETL (Extract, Transform, and Load) techniques, and TOAD.
- Ability to communicate and present design or support ideas in a team environment.
- Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Worklets and Workflows for data loads.
- Experience in using the Informatica command line utilities like PMCMD to execute workflows in UNIX environment.
- Experience in identifying and resolving root causes of ETL production issues.
- Experience in maintenance, enhancements, performance tuning of ETL code.
- Experience in using Automation Scheduling tools like Autosys and Tivoli job scheduler.
- Worked extensively with slowly changing dimensions.
- Hands-on experience with Agile methodology.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Extensive experience in designing, developing, and testing processes for loading initial data into a data warehouse.
- Strong knowledge of data warehousing concepts; hands-on experience with Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad, SQL Assistant, Data Mover) and UNIX.
- Extensively worked on database objects such as tables, indexes, and views.
- Strong experience with RDBMS concepts.
- Strong experience in OLAP data modeling: dimensional data modeling, star and snowflake schema modeling, fact and dimension tables, and physical and logical data modeling.
- Experience in creating mappings, workflows using Informatica Power Center
- Experience in developing complex SQL queries using joins, subqueries, and correlated subqueries.
- Experienced in working with business users, business analysts, IT leads and developers in analyzing business requirements and translating requirements into functional and technical design specifications.
- Knowledge of migrations from SQL to Netezza.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills; able to work effectively both as a team member and individually.
Sr ETL Informatica/ Talend Developer
Confidential - Denver, CO
- Developed complex ETL jobs to extract data from sources such as SQL Server, PostgreSQL, and flat files and load it into target databases using the Talend Open Studio ETL tool.
- Created Talend jobs using the dynamic schema feature.
- Interacted with the business community to gather requirements based on changing needs, and incorporated the identified factors into Talend jobs to build the data mart.
- Performance tuning: used tMap cache properties, multi-threading, and parallelization components to improve performance on large source datasets, and tuned source SQL queries to filter out unwanted data early in the ETL process.
- Involved in preparing detailed design and technical documents from the functional specifications.
- Prepared low-level design documentation for implementing new data elements in the EDW.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
- Used AWS (Amazon Web Services) components: downloaded and uploaded data files to and from Amazon S3 within ETL jobs using Talend's S3 components.
- Used a wide range of Talend components in job designs, including tMap, tFilterRow, tJava, tOracle components, tXMLMap, delimited-file components, tLogRow, and Logback logging.
- Worked on Joblets (reusable code) & Java routines in Talend.
- Design, Develop and Test ETL processes in order to meet project requirements
- Created projects in TAC, assigned appropriate roles to developers, and integrated SVN (Subversion).
- Provided on-call support when the project was deployed to subsequent phases.
- Used the Talend Administration Center Job Conductor to schedule ETL jobs on daily, weekly, monthly, and yearly schedules (cron triggers).
Environment: Talend Open Studio v6.1.1/6.2.1, UNIX, AWS S3, Microsoft SQL Server Management Studio, PostgreSQL, Windows XP
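For context, the daily/weekly/monthly/yearly cadences scheduled through the TAC Job Conductor map onto ordinary cron expressions of roughly this shape (TAC uses Quartz cron, which adds a leading seconds field; the job paths below are hypothetical):

```
# min  hour  dom  mon  dow   command
  0    2     *    *    *     /opt/talend/jobs/run_daily_load.sh     # daily at 02:00
  0    3     *    *    0     /opt/talend/jobs/run_weekly_load.sh    # weekly, Sunday 03:00
  0    4     1    *    *     /opt/talend/jobs/run_monthly_load.sh   # monthly, 1st at 04:00
  0    5     1    1    *     /opt/talend/jobs/run_yearly_load.sh    # yearly, Jan 1 at 05:00
```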
ETL Informatica/ Talend Developer
- Designed and coded ETL/Talend jobs to process data into target databases.
- Created Talend jobs to load data into various Oracle tables; utilized Oracle stored procedures and wrote Java routines to capture global map variables and use them in the jobs.
- Created Talend jobs to copy files from one server to another using Talend FTP components.
- Created implicit, local, and global context variables in jobs.
- Responsible for creating fact, lookup, dimension, and staging tables, as well as other database objects such as views, stored procedures, functions, indexes, and constraints.
- Developed complex Talend ETL jobs to migrate the data from flat files to database
- Implemented custom Error handling in Talend jobs and worked on different methods of logging
- Followed the organization's naming conventions for flat-file structures, Talend jobs, and the daily batches that execute the Talend jobs.
- Created UNIX scripts, run via tSystem, to read data from flat files and archive the flat files on the specified server.
- Tuned sources, targets and jobs to improve the performance
- Monitored and troubleshot batches and jobs for weekly and monthly extracts from various data sources across all platforms into the target database.
- Provided production support by running the jobs and fixing bugs.
- Created mapping documents to outline data flow from sources to targets
- Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
- Maintained source definitions, transformation rules, and target definitions.
Environment: Talend 5.x, XML files, DB2, Oracle 11g, SQL Server 2008, SQL, UNIX shell scripts
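A minimal sketch of the kind of flat-file archival script described above (directory names and the `.dat` extension are hypothetical; in the jobs themselves this logic was invoked through Talend's tSystem component):

```shell
#!/bin/sh
# Archive processed flat files: move them out of the landing area and
# compress them under a date-stamped archive directory.
archive_flat_files() {
  src_dir="$1"                         # landing area (placeholder path)
  archive_dir="$2"                     # archive root (placeholder path)
  stamp=$(date +%Y%m%d)

  mkdir -p "$archive_dir/$stamp"

  for f in "$src_dir"/*.dat; do
    [ -e "$f" ] || continue            # nothing to archive
    mv "$f" "$archive_dir/$stamp/"
    gzip -f "$archive_dir/$stamp/$(basename "$f")"
  done
}
```

A job would call this after a successful load, e.g. `archive_flat_files /data/landing /data/archive`.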
ETL Informatica Developer
Confidential - Warren, NY
- Worked as part of the development team, along with the Systems Analysts/Business Analysts to design the CDM data model, and the architecture of the system
- Created multiple Type 2 mappings in the CDM for Dimensions as well as Fact tables, implementing both date based and flag based versioning logic.
- Set up Batches and sessions to schedule the loads at required frequency using Power Center Workflow manager. Generated completion messages and status reports using Workflow Manager/Workflow Monitor.
- Developed mappings for Customer, Product, Premise and Campaign dimensions, implementing the ETL logic provided as part of the Technical Specifications.
- Created logical dimensional star and snowflake schemas (Kimball), entity-relationship diagrams, normalization, and physical schemas.
- Worked on Informatica - Repository Manager, Designer, Workflow Manager & Workflow Monitor
- Extensively worked with Informatica tools such as Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer to design, develop, and test complex mappings and mapplets that load data from external flat files and RDBMS sources.
- Extensively used transformations such as Source Qualifier, Joiner, Aggregator, connected and unconnected Lookup, Filter, Router, Expression, Rank, Union, Normalizer, Update Strategy, and Sequence Generator.
- Defined derived tables, classes, objects, dimensions, measures, details, conditions, and joins; resolved loops (chasm traps and fan traps) using aliases and contexts, and verified the integrity of universes.
- Used Web Intelligence and Xcelsius for web-based query and reporting.
- Developed reports with parameters and sub-reports based on stored procedures and queries using Crystal Reports 2008/XI.
- Upgraded Business Objects Enterprise XI R2 to XI 3.1/SAP BO
- Used SQL, PL/SQL, and TOAD to validate the data going into the data warehouse.
- Extensively worked with the QA (Quality specialist/analyst) team to get the system and unit testing done successfully
Environment: Informatica PowerCenter 8.6/8.1, Oracle 10g, PL/SQL, SQL, SQL*Plus, SQL*Loader, SQL Server 2005, SAP BO/BO XI R3, shell scripting on UNIX (SunOS 10), Windows.
ETL Informatica Developer
Confidential, Bethpage, NY
- Imported data from various sources, transformed it, and loaded it into data warehouse targets using Informatica.
- Fixed invalid mappings; tested stored procedures and functions; performed unit and integration testing of Confidential, batches, and target data.
- Extensively used transformations such as Router, Aggregator, Source Qualifier, Joiner, Expression, and Sequence Generator.
- Scheduled sessions and batches on the Informatica server using the Informatica Server Manager/Workflow Manager.
- Worked with pmcmd to interact with the Informatica server from the command line and execute shell scripts.
- Worked with different sources such as Oracle, MS SQL Server and flat files.
- Knowledge of slowly changing dimension tables and fact tables.
- Writing documentation to describe program development, logic, coding, testing, changes and corrections.
- Used Control-M for scheduling; created various UNIX shell scripts for scheduling data-cleansing scripts and the loading process, and maintained batch processes using UNIX shell scripts.
- Optimizing the mappings by changing the logic to reduce run time.
Environment: Informatica PowerCenter 8.1 (ETL), Oracle 9i, SQL, PL/SQL, SQL*Loader, SQL Server, Windows, HP-UX.
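As a rough illustration of the pmcmd usage above, a wrapper like the following builds the standard `startworkflow` invocation (the service, domain, folder, and workflow names are placeholders, and real credentials would come from a secured variable or pmpasswd rather than the command line):

```shell
#!/bin/sh
# Build a pmcmd 'startworkflow' command for the Informatica server.
# All names here are hypothetical placeholders.
build_pmcmd_cmd() {
  svc="$1"      # integration service
  dom="$2"      # Informatica domain
  fld="$3"      # repository folder
  wf="$4"       # workflow name
  echo "pmcmd startworkflow -sv $svc -d $dom -u \$INFA_USER -p \$INFA_PASS -f $fld -wait $wf"
}

# Execute only when pmcmd is actually installed; otherwise print the
# command so the wrapper can be dry-run from a scheduler.
run_workflow() {
  cmd=$(build_pmcmd_cmd "$@")
  if command -v pmcmd >/dev/null 2>&1; then
    eval "$cmd"
  else
    echo "$cmd"
  fi
}
```

A scheduler entry would then call e.g. `run_workflow IS_PROD DOM_PROD FolderX wf_daily_load`.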
Confidential - Owings Mills, MD
- Involved in the design and development of the ETL process across the full data warehouse life cycle (SDLC).
- Coding and testing of various database objects such as views, functions and stored procedures using SQL and PL/SQL
- Involved in designing the procedures for getting the data from all the source systems to Data Warehousing system.
- Loaded all membership/claims data per the business requirements by profiling the data, cleansing the raw source data, applying business transformations, massaging the data, and performing SQL optimization and performance tuning.
- Developed a number of complex Informatica mappings, mapplets, and reusable transformations to facilitate one-time, daily, monthly, and yearly data loads.
- Performance tuning on sources, targets, mappings, sessions and SQL queries in the transformations to improve the daily load performance.
- Designed and developed complex Aggregator, Joiner, Router, Lookup, and Update Strategy transformations based on the business rules.
- Created dynamic Lookup transformations to increase session performance.
- Experienced in integrating various data sources such as Oracle, DB2 UDB, Teradata, SQL Server, and MS Access into the staging area.
- Used scripting and scheduled pmcmd to interact with the Informatica server from the command line; involved in job scheduling, monitoring, and production support in a 24/7 environment.
- Implemented procedures/functions in PL/SQL for stored Procedure Transformations.
- Used various Oracle Index techniques to improve the query performance.
- Gained working knowledge of the Control-M scheduling tool for loading/force-starting jobs, changing job status, and monitoring job progress.
- Prepared BTEQ scripts to load data from Preserve area to Staging area.
- Experienced with Rational ClearCase for version control and ClearQuest for defect tracking.
Environment: Informatica PowerCenter 8.6 (Designer, Server Manager, Repository Manager), Control-M Scheduler, Tidal, SQL Server 2008, DB2 UDB, Erwin, SQL, PL/SQL, Perl scripting, UNIX, ClearCase, PuTTY, Windows 7
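The BTEQ loads from the Preserve area to Staging mentioned above generally follow the shape below; this is a hedged sketch, and the logon, schema, table, and column names are all placeholders:

```shell
#!/bin/sh
# Generate a BTEQ script that moves rows from a Preserve-area table to a
# Staging-area table. Schema, table, and logon values are placeholders.
write_bteq_script() {
  cat > "$1" <<'EOF'
.LOGON tdprod/etl_user,secret;

INSERT INTO staging_db.customer_stg (customer_id, customer_name, updated_ts)
SELECT customer_id, customer_name, updated_ts
FROM   preserve_db.customer_prsv
WHERE  updated_ts >= CURRENT_DATE - 1;

.IF ERRORCODE <> 0 THEN .EXIT 8;
.LOGOFF;
.EXIT 0;
EOF
}
# Typical execution on the ETL server: bteq < load_customer_stg.bteq
```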
Confidential - Santa Ana, CA
- Extensively involved in Gathering requirements by holding meetings with users.
- Identify entities and relationships in a business environment using an event-oriented focus.
- Modeled the business process in the enterprise wide scenario.
- Constructed context diagrams and data-flow diagrams based on a description of a business process; analyzed the data model and identified heterogeneous data sources.
- Constructed an extended entity relationship diagram based on a narrative description of a business scenario.
- Performed testing and QA: developed a test plan and test scenarios, and wrote SQL*Plus test scripts for execution against converted data to ensure consistency.
- Developed Informatica mappings and mapplets and tuned them for optimum performance and dependencies.
- Worked on Tuning database server, application logic and database objects.
- Worked on tuning SQL statements using Sybase query plan analyzer
- Managed database objects: indexes, triggers, procedures, functions, packages, and cursors.
- Involved in Maintenance of database, monitoring logs.
Environment: Informatica PowerMart (Designer 5.x, Repository Manager 5.x, Server Manager 5.x), SQL Server 2000, Oracle 8i, UNIX/AIX
Confidential - Chandler, AZ
- Gathered required data, performed data analysis, and documented the plan.
- Gathered the data elements needed for analysis, decoded flat-file data, and loaded it into database tables using SQL*Loader.
- Responsible for data mapping from legacy system to Oracle
- Created database objects like tables, synonyms, sequences, views
- Developed various data exception reports and submitted to the client for data clean up
- Created materialized views, partitions, tables, views and indexes
- Checking alert logs on daily basis and resolving problems.
- Troubleshooting server configuration files such as tnsnames.ora and listener.ora.
- Transferring logical data using the Export/Import and expdp/impdp (Data Pump) utilities.
- Resolving user issues like long running sessions and locks.
- Involved in tuning and optimizing SQL statements.
- Configured RMAN and performed regular backups using the RMAN tool.
- Implemented backup strategies (hot and cold backups).
- Proactively monitored database health and worked as part of a team.
- Created/modified the procedures, functions and packages to support data conversion
- Used SQL hints and indexes to improve the performance of queries
- Modified the existing shell scripts to support conversion process
- Checked explain plan of the SQL queries to improve the performance
Environment: Oracle 8i/9i, TOAD, SQL*Plus, SQL, PL/SQL, SQL*Loader, Import/Export utilities, SQL Server, Windows NT, UNIX, HP-UX.
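The SQL*Loader flat-file loads described above are driven by a control file; a minimal sketch follows (the table, columns, and file names are hypothetical):

```shell
#!/bin/sh
# Write a minimal SQL*Loader control file for a pipe-delimited flat file.
# Table, column, and file names are placeholders for illustration.
write_ctl_file() {
  cat > "$1" <<'EOF'
LOAD DATA
INFILE 'legacy_members.dat'
BADFILE 'legacy_members.bad'
APPEND
INTO TABLE legacy_members
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( member_id,
  member_name,
  join_date DATE "YYYY-MM-DD"
)
EOF
}
# Typical invocation: sqlldr userid=app_user/secret control=legacy_members.ctl log=legacy_members.log
```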