
Datastage Developer Resume


SUMMARY

Technical Skills:

ETL: IBM InfoSphere DataStage 8.1/7.5x2 (Server/PX)
IBM QualityStage 8.1, IBM Information Analyzer 8.1
Informatica 8.5/8.1/7.1
OLAP: Cognos 8.1, Business Objects XI R2, SAS 9, OBIEE
Data Modeling: Erwin 7/4.5, MS Visio, Embarcadero ER/Studio
Database: Oracle 10g, UDB/DB2, SQL Server
Languages: C, C++, Java, SQL, PL/SQL, HTML
Operating Systems: AIX 5.3, Solaris 10, Linux, Windows XP
Scheduling: Autosys, UC4
Knowledge of: Siebel 8.0 (SFA), SAP CRM 5.0 (Service Agreements)
Oracle EBS 11.5.1 TCA (Order Management, Finance)
SAP R/3 ABAP 4.7 (IDOC, BAPI), BW 3.5
SAP NetWeaver XI/PI 7.0

Professional Summary:

IT Experience: Over 9 years of experience in Information Management, spanning end-to-end Data Warehousing, Business Intelligence, Data Integration & Data Migration: developing data warehouse roadmaps, strategy and architecture, enterprise data warehouses, ODS, dimensional data marts and end-user reports for the Healthcare, Manufacturing, Retail, Pharmaceutical, Financial and Insurance industries.

Data Transformation: Over 5 years of IBM InfoSphere DataStage 8.0.1/7.5.2/6.0 (Administrator, Designer, Director and Manager), both Server & Parallel Extender/Orchestrate, and Multi-Client Manager (7.5.2 to 8.0.1). Informatica PowerCenter 7.2/8.1/8.5.

Data Cleansing & Standardization: Over a year of data cleansing experience (deduplication, relationships, address validation; identifying, standardizing, matching and reconciling records) using IBM DataStage and QualityStage.

Data Profiling: One year of data profiling experience with IBM Information Analyzer 8.0.1/ProfileStage (validating data values and column/table relationships, source-to-target field mappings, source system profiling and analysis), working jointly with SMEs & data modelers.

SAP Knowledge: SAP ECC & R/3 ABAP 4.7 (LSMW, BAPI, IDOC), SAP BW 3.5, SAP NetWeaver XI 3.0/7.0, SAP CRM 5.0.

Siebel Knowledge: Siebel SFA, Supply Chain, Service & Call Centre (EIM & Base Tables).

Data Integration Knowledge: IBM WBI Message Broker (MQ, JMS, AS1, AS2, XML, SOAP, WSDL), IBM Websphere TX, B2B Gateway, Agent to Agent.

SDLC Models: Worked with business system analysts, designers and developers to adopt processes based on Agile/Scrum development techniques.

Education:

  • MS Business Information Systems

Professional Experience:

Confidential, Oct 2011 – Present
Confidential, Franklin Lakes, NJ

Confidential has been utilizing SAP application technology on the whole for the past 6-7 years. While SAP is the primary technology enabler within its application arsenal, it is not used by all of its geographic regions around the world. Similarly, the regions and business segments that are using SAP are not all on the same release level or business model, as evidenced by the fact that Medical Products and Diagnostics are supported by R/3 4.0b while Biosciences operates R/3 4.7c. This has precluded the Company from improving its operational efficiencies and applying common, standardized processes throughout its organization. The project migrates data from the legacy systems to ECC 6.0.

Responsibilities:
• Work with the SAP ABAP team to fetch the legacy data files.
• Design the extract jobs using the SAP PACKs (ABAP Extract, IDOC Extract, and BAPI).
• Design the DataStage ETL jobs, applying the business rules provided by the SAP process team.
• Design the quality jobs for the de-duplication process for SAP partners and vendors (see the SQL sketch after this list).
• Design the jobs for loading data into the legacy tables and LSMW tables.
• Modify the Oracle procedures that support the Cognos error and data validation reports.
• Used Investigate, Standardize, Match & Survive stages in QualityStage to harmonize and align the data to create a single view of the Business Partner and Vendor system.
• Used domain preprocessor & domain-specific rule sets like USPREP, USNAME, USADDR, USAREA to standardize, match, dedupe and survive the Business Partner and Vendor Master using custom rule sets in QualityStage.
• Design the sequence jobs for the SFTP process to generate the LSMW file.
• Design delta jobs that fetch the data from staging and load it into the legacy tables.
• Export the .dsx files to move the developed code into SVN.
• Work with the off-shore team as part of the development process.
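
A minimal SQL sketch of the survivorship idea behind the de-duplication jobs. The table and column names (VENDOR_STG, VENDOR_NAME, and so on) are hypothetical; the production matching was done with QualityStage rule sets, not SQL:

    -- Keep one surviving row per standardized vendor name and postal code.
    -- Illustrative names only; QualityStage match/survive handled the
    -- real fuzzy matching.
    SELECT vendor_id, vendor_name, postal_code
      FROM (SELECT v.*,
                   ROW_NUMBER() OVER (
                       PARTITION BY UPPER(TRIM(vendor_name)), postal_code
                       ORDER BY last_update_date DESC) AS rn
              FROM vendor_stg v)
     WHERE rn = 1;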

Environment: IBM Information Server 8.1 (DataStage PX, QualityStage, Information Analyzer, SAP PACKs), SAP R/3 4.0b, 4.7, ECC 6.0, Oracle 10g, Cognos 8.4, Sun Solaris 10, Tortoise SVN, UC4.

Confidential, June 2011 – Oct 2011
Confidential, Trenton, NJ
DataStage Developer

The goals of this project are to analyze, restructure, load and maintain State of New Jersey data files and structures, currently residing on legacy mainframe systems accessed with the Focus language and databases, in the EDW. The project will make that same data available to approved NJ employees through the Business Objects reporting tool. An interface and semantic layer will give users the same level of data access for reports and downloads to various other data systems.

Responsibilities:
• Understand the business and technical documents of the project.
• Prepare the ETL column-level mapping from the various source systems to the target database system, applying the transformation logic while building the mapping documents.
• Design the DataStage jobs that extract data from the ODS systems, load it into staging, and later write it into the dimension tables.
• Design the daily incremental DataStage jobs to compare the old and new records using the CDC / delta process (a SQL sketch of the idea follows this list).
• Write the custom before/after and transformation routines.
• Design sequence jobs and set up mail notifications, or notification at the individual job level, when we get rejects.
• Improve ETL job performance through query optimization and tuning of the DataStage jobs.
• Design reusable shared containers at the project level.
• Prepare the unit test cases and validate the data based on the business rules.
• Import and export developed and executable ETL job components to maintain production backups and version control.
• Analyze the source data files coming from the mainframes.
• Modify the COBOL File Definition PIC clauses and data types to read the EBCDIC files.
• Prepare the source-to-target mapping document, including the transformation rules.
• Design the ETL jobs to read the mainframe files and write into the staging and dimension tables.
• Follow best practices while designing the DataStage jobs, using naming standards.
• Design and develop ETL jobs using DataStage 8.5 for new projects and DataStage 7.5 for existing projects.
• Design the audit, error log and control tables to maintain job statistics for users.
• Prepare the unit test document by validating the data in the tables.
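
A hedged SQL sketch of the daily delta idea: isolate staging rows that are new or changed versus the current dimension image. Names are illustrative; the production jobs used the DataStage CDC/Change Capture stages rather than hand-written SQL:

    -- Rows in today's extract that are new or changed versus the
    -- dimension (illustrative table and column names).
    SELECT cust_id, cust_name, cust_addr FROM stg_customer
    MINUS
    SELECT cust_id, cust_name, cust_addr FROM dim_customer;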

Environment: IBM Information Server 8.5/7.5 (DataStage, QualityStage & Information Analyzer), Business Objects XI R3, Oracle 10g, AIX 5.3, Focus, Adabas, Erwin and TOAD.

Confidential, Oct’2010 – June’2011
PeopleSoft EPM (ETL/BI Analyst)

The Statewide Integrated Financial Tools (SWIFT) Project is actively collaborating with state agencies to replace the current Minnesota Accounting and Procurement System with a PeopleSoft Enterprise Resource Planning system. SWIFT will integrate all of the administrative functions across state agencies, including financial, procurement, reporting and the current SEMA4 (human resources/payroll) system.

Responsibilities:
• Preparing the source-to-target data mapping for ETL design and development.
• Developed DataStage ETL jobs and Data Loader definitions in the Enterprise Warehouse based on the client’s requirements.
• Experienced in database programming for data warehouses (schemas); proficient in dimensional modeling (star schema and snowflake modeling).
• Follow the ETL standards and naming conventions, especially for DataStage project categories, stage names and links, and maintain best practices as per the existing EDW.
• Developed DataStage ETL jobs and Data Loader definitions as specified by the requirements of the customer.
• Reconfigured and set up the DataStage ETL jobs to go to the ODS layer of the Enterprise Warehouse.
• Created multiple documents including requirements, scope and customization analysis.
• Customize the DataStage jobs according to the client's business needs.
• Identified bottlenecks in the DataStage ETL and Data Loader process and tuned them as necessary, resulting in a decrease in processing time.
• Prepare the schedule process for all the individual sequence jobs.
• Apply the CRC logic (hash file) to compare the transaction data coming from the PeopleSoft source systems to the OWS environment.
• Validate the data and record counts in the OWS, OWE and MDW tables (see the count-reconciliation sketch after this list).
• Prepare the documentation by validating the data in the error tables.
• Set up and configured the Enterprise Warehouse, including PF Business Units and SETIDs.
• Created a reconciliation process (SQL scripts, Word doc, etc.) in order to verify the accuracy of the results.
• Created many deliverables including the Technical Project Plan, Customization Effort Level Document, ETL Processes Definition, Enterprise Warehouse Outline, End-to-End Documentation, Setting Up Source Environments, Setting Up a New EPM Environment, Test Scripts and SQL Scripts.
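
A small sketch of the count reconciliation used to validate the warehouse layers. The table names are placeholders, not the delivered EPM table names:

    -- Compare row counts across the OWS, OWE and MDW layers
    -- for one subject area (placeholder table names).
    SELECT 'OWS' AS layer, COUNT(*) AS row_cnt FROM ows_journal
    UNION ALL
    SELECT 'OWE', COUNT(*) FROM owe_journal
    UNION ALL
    SELECT 'MDW', COUNT(*) FROM mdw_journal_fact;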

Environment: IBM Information Server 8.1 (DataStage, Metadata Workbench, Business Glossary), PeopleSoft EPM 9.1, PeopleTools 8.5.11, Oracle 11g, OBIEE 10.3.

Confidential, Westborough, MA July’2010 – Oct’2010
DataStage Developer

Responsibilities:
• Requirement analysis/documentation, developing functional and technical specifications, DW and ETL design, detailed mapping specifications, DFDs and scheduling charts.
• Data profiling using Information Analyzer for column analysis, primary key analysis and foreign key analysis.
• Working with QualityStage for data cleansing, standardization, matching and survivorship.
• Designed technical design specs during the design phase and developed ETL process flow diagrams.
• Developed scheduling charts and scheduled shell scripts, DataStage ETL jobs and reports using ESP.
• Move the developed components into AllFusion Harvest.
• Worked with the data modeler and database administrator to implement database changes.
• Introduced restartable DataStage jobs to address batch cycle failures by redesigning non-restartable DataStage jobs that are critical to the batch cycle.
• Mentored developers by introducing best practices to reduce design complexity and implement the best parallelism in DataStage PX jobs.
• Used DataStage Director to debug, run and monitor the jobs, and DataStage Designer to import source/target metadata definitions and export/import DataStage jobs and components.
• Design and develop the new ETL jobs and modify the existing jobs as per the new process.
• Design DataStage PX jobs that extract, integrate, aggregate, load and transform the data into the data warehouse or data mart.
• Create and reuse metadata and job components.
• Design the SCM data mart dimension and fact tables for data coming from Manugistics.
• Design the jobs using the OCI/Oracle EE stage, ODBC Enterprise stage, Lookup stage, Change Capture stage, Sort, Funnel, Transformer, Peek, Head and Tail stages.
• Using the nzload utility, load the daily, weekly and monthly data into the UNIT_CONTROL data mart.
• Created tables and indexes, and modified the aggregate tables as per the requirement (a sketch follows this list).
• Prepared the unit and SIT test cases based on the designed and modified jobs.
• Modify the incremental sequencer to support the modified jobs.
• Worked with metadata definitions and the import and export of DataStage jobs.
• Get mainframe data and put it on the Linux box using an FTP script.
• Using the FTP plug-in, get mainframe data and load it into DB2 tables.
• Write shell scripts for the file watcher and file archiving process.
• Work with the TJX Canadian team as part of production support.
• Defined the backup recovery process for DataStage projects.
• Extensively developed UNIX shell scripts for data manipulation.
• Defined & implemented DataStage job process monitoring.
• Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently with effective communication skills.
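
A hedged sketch of the aggregate-table work, written in Oracle syntax for illustration (the mart itself sat on Netezza/DB2); the table names and the weekly grain are assumptions:

    -- Build a weekly roll-up from the daily fact and index it
    -- (illustrative names and grain).
    CREATE TABLE sales_wk_agg AS
        SELECT store_id,
               TRUNC(txn_date, 'IW') AS week_start,
               SUM(sale_amt)         AS sale_amt
          FROM sales_dly_fact
         GROUP BY store_id, TRUNC(txn_date, 'IW');

    CREATE INDEX sales_wk_agg_ix1 ON sales_wk_agg (store_id, week_start);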

Environment: IBM InfoSphere Information Server 8.1, DB2/UDB 9.1, Netezza 7.2, SQL Server, Sybase, Oracle EBS, JDA 7, Cognos 8.3, AIX 5.3, Hummingbird, CA ESP, Harvest, Linux, Windows XP.

Confidential, Durham, NC Feb’2009 – July’2009
Sr. ETL Consultant (DataStage Developer)

Responsibilities:
• Analyze the existing EDW and prepare the mapping documents.
• Design and develop the new ETL jobs and modify the existing jobs.
• Follow the ETL standards and naming conventions, especially for DataStage project categories, stage names and links, and maintain best practices as per the existing EDW.
• Design DataStage PX jobs that extract, integrate, aggregate, load and transform the data into the data warehouse or data mart.
• PL/SQL packages to create daily reports in CSV format and email these reports to business users using the UTL_FILE and UTL_SMTP built-in Oracle packages (a sketch follows this list).
• PL/SQL packages for daily summarization of Sales and Customer data, developed in UNIX and PL/SQL.
• PL/SQL packages for automating many of the manual queries used by business users.
• Create and reuse metadata and job components.
• Design the jobs using the OCI/Oracle EE stage, Lookup stage, CDC stage, Sort, Funnel, Transformer, Peek, Head and Tail stages.
• Implementing SCD Type-2 using the SCD stage & Change Capture stages.
• Worked with metadata definitions and the import and export of DataStage jobs.
• Implemented security among DataStage users and projects.
• Created crosscheck UNIX shell scripts on interface files and audit reports on data extracted and data loaded, and implemented post-execution scripts to reconcile the data.
• Set up UNIX groups, defined UNIX user profiles and assigned privileges.
• Defined the backup recovery process for DataStage projects.
• Defined & implemented DataStage job process monitoring.
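
A minimal PL/SQL sketch of the CSV-report piece, assuming a hypothetical DAILY_SALES view and a REPORT_DIR directory object; the real packages added headers, error handling and a UTL_SMTP mail step:

    CREATE OR REPLACE PROCEDURE write_daily_csv IS
        f UTL_FILE.FILE_TYPE;
    BEGIN
        -- REPORT_DIR must exist as an Oracle directory object.
        f := UTL_FILE.FOPEN('REPORT_DIR', 'daily_sales.csv', 'w');
        FOR r IN (SELECT region, SUM(amount) AS amt
                    FROM daily_sales
                   GROUP BY region) LOOP
            UTL_FILE.PUT_LINE(f, r.region || ',' || r.amt);
        END LOOP;
        UTL_FILE.FCLOSE(f);
    END;
    /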

Environment: IBM Datastage 7.5.2/7.5.3 (PX/MVS), Sun Solaris, Cognos 8.1, Oracle 10g, SQL Server 2005, TOAD, Tortoise CVS 1.8.3, Erwin.

Confidential, Sunnyvale, CA Mar’2008 – Jan’2009
Lead Data Stage Consultant

Responsibilities:
• Analyze the existing EDW environment and find the gaps.
• Perform impact analysis and identify cardinality changes.
• Experience in converting the business logic into technical specifications.
• Prepare high-level and low-level design documents.
• Schedule the meetings with upstream and downstream teams.
• Design and develop the new ETL jobs and modify the existing jobs.
• Follow the ETL standards and naming conventions, especially for DataStage project categories, stage names and links, and maintain best practices as per the existing EDW.
• Tune the DataStage jobs at the design level and tune the custom SQL scripts.
• Used Investigate, Standardize, Match & Survive stages in QualityStage to harmonize and align the data to create a single view of customers.
• Used domain preprocessor & domain-specific rule sets like USPREP, USNAME, USADDR, USAREA to standardize the Customer Master and Vendor Master; standardized, matched, deduped and survived using custom rule sets in QualityStage (a rough SQL analogue follows this list).
• Used QualityStage to develop jobs which involved converting variable-length records to fixed-length records, parsing the fields into single-domain data fields, identifying the most commonly used pattern for each field, selecting subsets of records, and standardizing the data by converting each field into its most commonly used format.
• Used the SAP BW PACK BW Load stage and BW Open Hub Extract stage to pull & push data into SAP BW InfoPackages and Process Chains.
• Extensively used SAP R/3 stages like IDOC Load, IDOC Extract, ABAP and BAPI.
• Customized the PL/SQL code as per the rules engine.
• Used PL/SQL to create packages, functions and procedures.
• Work with different internal teams and the offshore team.
• Created tables and indexes, and modified the aggregate tables as per the requirement.
• Prepared the unit test cases based on the designed and modified jobs.
• Modify the incremental sequencer to support the modified jobs.
• Maintain and assign defects using HP Quality Center.
• Implemented security among DataStage users and projects.
• Set up development, QA & production environments.
• Migrated jobs from development to QA to production environments.
• Involved in preparing FSD documentation.
• Defined production support methodologies and strategies.
• Defined the backup recovery process for DataStage projects.
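
The standardization itself was done with QualityStage rule sets; as a rough SQL analogue of the kind of cleanup USNAME/USADDR-style rules perform (hypothetical table and columns):

    -- Crude name/address normalization ahead of matching
    -- (illustrative only; the rule sets are far richer).
    UPDATE cust_stg
       SET cust_name = UPPER(REGEXP_REPLACE(TRIM(cust_name), ' {2,}', ' ')),
           addr_line = REPLACE(UPPER(addr_line), 'STREET', 'ST');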

Environment: Windows XP / Sun Solaris, IBM Datastage 7.5.x (Server), IBM Datastage, QualityStage & Information Analyzer 8.0.1, Cognos 8.1, OBIEE, Siebel 8.0, Oracle 11i, Oracle CDH, SAP ECC 6.0, mySAP CRM 5.0, SAP NetWeaver PI 7.0, TIBCO, Oracle 10g, TOAD 8.0, Erwin.

Confidential, Roanoke, VA Aug’2007 – Feb’2008
Lead Data Stage Consultant

Responsibilities:
• Design the ETL jobs based on the DMD, with the required tables, in the Dev environment.
• Designed and developed the star schema dimensional model.
• Developed various jobs using DataStage PX stages: DB2 API / DB2 EE stages, Lookup stage, Data Sets, Funnel, Remove Duplicates stage, Change Capture stage, Change Apply stage and ODBC stage.
• Provided production support and customer support for the newly developed data marts and subject areas like Replenishment Stock and Inventory Reduction.
• Applying rule sets using QualityStage to maintain customer information.
• Provide the staging solutions for data validation and cleansing with QualityStage and DataStage ETL jobs.
• Load the data into the Financial data mart, getting data from the PeopleSoft GL tables.
• As a DataStage consultant, working with PeopleSoft EPM.
• Developed DataStage ETL jobs and Data Loader definitions as specified by the requirements of the customer.
• Reconfigured and set up the DataStage ETL jobs to go to the ODS layer of the Enterprise Warehouse.
• Prepare the schedule process for all the individual sequence jobs.
• Validate the data and record counts in the OWS, OWE and MDW tables.
• Read the supply chain data from the Salesforce application.
• Tuning the DataStage jobs at the source level, transformation level and target load level.
• Supporting the existing jobs in DataStage 7.5.2 using the Multi-Client Manager.
• Data extraction from the iSeries DB2 database, Oracle and flat files.
• Implemented Slowly Changing Dimension Type-2 concepts (a SQL sketch follows this list).
• Performance tuning of the DB2 target database using the explain plan (access plan).
• Validation testing and unit testing using the existing AS/400 required data.
• Validate and compare the source flat file data using a Perl script on the UNIX box.
• Scheduling the DataStage batch jobs using UC4.
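
A hedged, Oracle-flavored two-statement sketch of the SCD Type-2 pattern that the SCD and Change Capture stages implement; all names are illustrative:

    -- 1) Close the current version of rows whose attributes changed.
    UPDATE dim_product d
       SET d.eff_end_dt = TRUNC(SYSDATE) - 1,
           d.curr_flag  = 'N'
     WHERE d.curr_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_product s
                    WHERE s.product_id = d.product_id
                      AND s.product_desc <> d.product_desc);

    -- 2) Insert new and changed products as the current version.
    INSERT INTO dim_product
        (product_key, product_id, product_desc,
         eff_start_dt, eff_end_dt, curr_flag)
    SELECT dim_product_seq.NEXTVAL, s.product_id, s.product_desc,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_product s
     WHERE NOT EXISTS (SELECT 1 FROM dim_product d
                        WHERE d.product_id = s.product_id
                          AND d.curr_flag = 'Y');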

Environment: IBM Infosphere Information Server 8.0, Crystal Reports, SPSS, Business Objects XI R2, PeopleSoft EPM 8.9/9.1, PeopleTools 8, Oracle 11g, OBIEE 10.3, JDA, Oracle 10g, SQL Server, DB2 UDB 8/9.1, IBM BCU, TOAD, AIX 5.3, Win XP Pro, UC4.

Confidential, Lexington, KY Sept’2006 – July’2007
Lead Data Stage Consultant

Responsibilities:
• Responsible for gathering business requirements from end users.
• Prepare the data mapping documents and pseudo code.
• Design the ETL jobs based on the DMD, with the required tables, in the Dev environment.
• Reading the data from the Siebel SFA, Supply Chain and Service modules.
• Supporting global regions: North/South America, EMEA and Asia Pacific.
• Read data from Siebel base tables using the Siebel Direct plug-in.
• Write data to Siebel base tables through the EIM tables using the EIM plug-in.
• Designed and developed the star schema dimensional model.
• Designed and developed various jobs using DataStage Parallel Extender stages: OCI, Hashed File, Sequential File, Aggregator, Pivot and Sort.
• Implemented Slowly Changing Dimension concepts.
• Worked with metadata definitions, and import & export of .dsx files using DataStage Manager.
• Set up UNIX groups, defined UNIX user profiles and assigned privileges.
• Defined the backup recovery process for DataStage projects.
• Defined & implemented DataStage job process monitoring.
• Defined K-shell scripts for the file watcher and file archiving process.
• Installed packages and performed patch management.
• Validation testing and unit testing using the required Siebel data.
• Primary contact for business users during UAT testing.

Environment: Ascential Datastage 7.5 (Server/PX/MVS), Crystal Reports, MicroStrategy, BO XI R2, Oracle 10g, SQL, PL/SQL, Siebel 7.3, JD Edwards, AS/400, TOAD, UNIX shell scripts, Sun Solaris 8.0, Win XP, VM, ClearCase.

Confidential, Atlanta, GA Jun’2005 – Mar’2006
Sr. ETL Consultant (IBM DataStage)

Responsibilities:
• Identified the various DataStage jobs, PL/SQL scripts and UNIX scripts that were impacted and had to be designed/created, and created the technical specifications and mapping documents for the various tasks.
• Developed and loaded data warehouse tables such as dimension, fact and aggregate tables using IBM DataStage.
• Extensively used the IBM DataStage Designer to build complex mappings based on user specifications.
• Used the Aggregator and Transformer stages to calculate fields based on the business requirements, and performed date transformations.
• Developed triggers and views for data auditing and security purposes (a trigger sketch follows this list).
• Developed UNIX scripts (Korn shell) to communicate production error messages to the appropriate support personnel, and developed routines to automate the copying of files to remote servers.
• Developed UNIX scripts for data validation and threshold checks, and to email reject records to the business and ETL primaries.
• Loaded data into Teradata using DataStage, FastLoad, BTEQ, FastExport, MultiLoad and Korn shell scripts.
• Analyzed business requirements, transformed data, and mapped source data using the Teradata Financial Services Logical Data Model tool, from the source system to the Teradata physical data model.
• Worked closely with the source team and users to validate the accuracy of the mapped attributes.
• Troubleshot and created automatic script/SQL generators.
• Maintained versions of DataStage code and UNIX scripts using Rational ClearCase.
• Used MS Visio to illustrate process flows for documentation purposes.
• Performance tuning of complex queries using explain plans.
• Created documents for test plans and technical guides.
• Involved in SIT and UAT test case generation and support.
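
A minimal sketch of the auditing triggers, assuming a hypothetical ACCOUNT table and ACCOUNT_AUDIT log; the real triggers and views followed the project's security requirements:

    -- Record who changed or deleted a row, and when (illustrative names).
    CREATE OR REPLACE TRIGGER trg_account_audit
        AFTER UPDATE OR DELETE ON account
        FOR EACH ROW
    BEGIN
        INSERT INTO account_audit (acct_id, action, changed_by, changed_at)
        VALUES (:OLD.acct_id,
                CASE WHEN DELETING THEN 'D' ELSE 'U' END,
                USER, SYSDATE);
    END;
    /
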
Environment: Ascential Datastage 7.5/7.0/EE, QualityStage, Oracle 9i, PL/SQL, TOAD, SQL Server, PeopleSoft, DB2 UDB, Teradata, AIX.

Confidential, Omaha, NE Oct’2004 – Mar’2005
Sr. ETL Consultant (IBM DataStage)

Responsibilities:
• Involved in the migration process from DEV to Test and then to PRD.
• Obtained a detailed understanding of the data sources, flat files and complex data schemas.
• Used IBM DataStage as an ETL tool to extract data from source systems like Sybase, DB2, VSAM files and flat files, aggregate the data, and load it into the target DB2.
• Read and write data using the Sybase OC stage.
• Created a reusable repository using DataStage Manager.
• Involved in multiple subject areas like Providers and Claims.
• Designed XML stages for reading XML log files to capture DataStage job audit data.
• Installed and configured the MQ Series plug-in and captured online messages.
• Developed jobs in Parallel Extender using different stages like Transformer, Aggregator, source Data Set, External Filter, Row Generator, Column Generator and vector stages.
• Created crosscheck UNIX shell scripts on interface files and audit reports on data extracted and data loaded, and implemented post-execution scripts to reconcile the data.

Environment: Ascential DataStage 6.0/7.5.1, DB2, AS/400, Sybase, AIX, WebFOCUS, ClearCase, ClearQuest, and Cybermation.

Confidential, PA Nov’2003 – Sep’2004
Sr. ETL Consultant (IBM DataStage)

Responsibilities:
• Created a prototype for PSL to ease the quarterly submissions.
• Used Ascential DataStage as an ETL tool to extract data from sources like Sybase, DB2, VSAM files and flat files, and load it to the target Oracle database.
• Implemented Oracle Warehouse Builder & Oracle Bulk Loader for the staging area.
• Used the Lookup stage with reference to an Oracle table for the insert/update strategy and for updating slowly changing dimensions.
• Used DataStage Parallel Extender to split the data into subsets and load it, utilizing the available processors to achieve job performance, with configuration management of system resources in the Orchestrate environment.
• Worked with metadata definitions and the import and export of DataStage components using DataStage Manager.
• Integrate with SAP using the DataStage SAP R/3 Load & Extract PACKs (ABAP, IDOC & BAPI).
• Customized the ABAP programs while using the ABAP stages.
• Developed SQL scripts for data validation and testing (a sketch follows this list).
• Created UNIX shell scripts using K-shell for extracting and cleaning up the data, to load the data into the target CDW, for scheduling the jobs, and for email notification to capture the status of the jobs run.
• Created jobs in DataStage to transfer data from heterogeneous data sources like COBOL fixed-record flat files, CSV files, DB2, Oracle and text files to Oracle 9i.
• Convert data from EBCDIC to ASCII using the CFF stage.
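
A small sketch of the kind of validation SQL run after the loads; all names are placeholders:

    -- Orphan check: fact rows with no matching dimension key.
    SELECT COUNT(*) AS orphan_rows
      FROM sales_fact f
     WHERE NOT EXISTS (SELECT 1 FROM cust_dim d
                        WHERE d.cust_key = f.cust_key);

    -- Mandatory-column check on the staged data.
    SELECT COUNT(*) AS null_ids
      FROM stg_orders
     WHERE order_id IS NULL;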

Environment: Ascential Datastage 6.0/7.x (Server/PX/MVS), WebSphere, QualityStage, ProfileStage, MetaStage, DB2 UDB 7.0/8.0, Oracle 9.2, PL/SQL, SQL Server 2000, SAP R/3, ShowCase, Erwin 4.0, Cognos, IBM AIX 4.2, Rational ClearCase and Rational ClearQuest.

Confidential, Stuttgart, Germany July’2002 – Mar’2003
ETL Developer (Informatica)

Responsibilities:

  • Involved in the requirements definition and analysis in support of data warehousing efforts.
  • Led efforts in designing and implementing client information using dimensional modeling techniques.
  • Involved in developing the architecture for the data warehouse.
  • Data modeling with Erwin (both forward and reverse engineering).
  • Involved in designing the star schema and populating the fact tables.
  • De-normalized the tables and designed the dimension and fact tables.
  • Worked along with the data analysts on data requirements gathering, data analysis, testing and project coordination.
  • Involved in the extraction, transformation and loading of the data from various sources into the dimension and fact tables in the data warehouse (a fact-load sketch follows this list).
  • Involved in writing SQL and stored procedures and debugging them.
  • Developed complex mappings using various transformations like Source Qualifier, Expression, Filter, Aggregator, Lookup, Update Strategy, Sequence Generator, Joiner, Router and Normalizer transformations.
  • Created Mapplets and used them in many different fact and error table load mappings.
  • Used unconnected lookups where different expressions used the same lookup, had multiple targets, and used the same logic.
  • Developed all the mappings according to the design document and mapping specs, and performed testing.
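
A hedged SQL sketch of the dimension-key lookup behind the fact loads; in the mappings this logic lived in Lookup and Update Strategy transformations, and all names here are illustrative:

    -- Resolve natural keys to dimension surrogate keys while
    -- loading the fact (illustrative names).
    INSERT INTO sales_fact (cust_key, prod_key, sale_dt, sale_amt)
    SELECT c.cust_key, p.prod_key, s.sale_dt, s.sale_amt
      FROM stg_sales s
      JOIN cust_dim c ON c.cust_id = s.cust_id
      JOIN prod_dim p ON p.prod_id = s.prod_id;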

Environment:
Informatica PowerCenter 6.2.2, Business Objects 5.1, Erwin 4.0, DB2 7.1, SQL Server, Control-M, Sun Solaris and Windows

Confidential, Frankfurt, Germany June’2001 – July’2002
ETL Developer (Informatica)

Responsibilities:

  • Involved in logical and physical database design using Erwin.
  • Created the repositories and user groups using Informatica Repository Manager.
  • Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.
  • Created, scheduled and monitored the sessions and batches on the basis of run on demand, run on time, and run only once, using Informatica PowerCenter Server Manager.
  • Involved in the creation and maintenance of catalogs in Cognos Impromptu.
  • Created list, cross-tab and group reports.
  • Generated nested categories and revenue statements.
  • Performed data retrieval, multidimensional analysis, PowerPlay structure and reporter operations such as Sum and Average.
  • Wrote a macro to update the user class information in the Access Manager Administrator to provide security.
  • Developed and customized Impromptu reports.
  • Wrote applications in Perl to transfer files to a data storage device.
  • Queried different database tables as per the requirements.
  • Used the Cognos Query ad hoc reporting tool to explore, modify and create queries on the Web as per the functional owners' business requirements.
  • Created and published reports to the Web using IWR.

Environment:
Informatica 6.2.2/5.1, Siebel, Erwin 4.1, Oracle 9i, SQL Server 2000, MS Access, DB2, Linux, Windows NT.
