
Data Architect Resume


Mahwah, NJ

SUMMARY

  • Over 9 years of IT experience in software analysis, design and development for various applications in client-server environments, with expertise in providing Business Intelligence solutions in data warehousing for decision support systems.
  • Over 8 years of ETL experience with Informatica (PowerCenter, PowerExchange 9.0/8.x/7.x/6.x/5.x).
  • Currently working as a Data Modeler/Lead Informatica Developer on a project with a team of 10 members.
  • Extensive experience with SQL and PL/SQL, creating views, functions, triggers and stored procedures in data warehouse environments that employ relational databases such as Oracle, DB2 and SQL Server.
  • Experienced in working with packages, procedures, indexing and query tuning.
  • Proficient in designing ETL architectures for DW, MDM, data cleansing and data profiling.
  • Expertise in creating and working with relational and dimensional data models, maintaining Star and Snowflake schemas; experience in EAI/SOA software delivery projects.
  • Solid experience in data modeling, E. F. Codd's principles of normalization (3NF and BCNF), and the Kimball/Inmon methodologies of dimensional modeling.
  • Good experience with OLAP tools such as Cognos 8.0/7.2/7.1/7.0/6.0, Cognos ReportNet 1.0/1.1 MR2/MR3, Cognos Framework Manager and Business Objects XI/6.0.
  • Used the Cognos IWR service "CognosAuditData" to read the session logs created on the server, fetch the relevant data for each report and load it into the content database tables.
  • Expertise in creating current-state and future-state diagrams.
  • Experience in Informatica performance tuning methods and data quality.
  • Very good exposure to integrating Informatica Data Quality (IDQ) and IDE into ETL.
  • Development and production support of ETL mappings using Informatica for SAP ERP systems and SAP BW/BI.
  • Extensive experience loading data from flat files, XML, Oracle, Teradata 12.0 and DB2 into SQL Server and Oracle targets with Informatica.
  • Proficient in data modeling using the Erwin tool for creating physical and logical data models. Managed ODBC connections in almost all projects.
  • Experience in creating data mart architectures, designs and Entity Relationship (ER) diagrams.
  • Expertise with OLAP and OLTP systems and multi-tier Boolean logic.
  • Experience in developing both warehouse ETL and OLAP application code.
  • Worked on Salesforce (Sales, Service & Support, Analytics, and their collaboration platform).
  • Experience with UNIX installation and Business Intelligence tools such as Business Objects XI, Cognos and MicroStrategy.
  • Highly experienced in integrating heterogeneous data sources such as Oracle, COBOL, SQL Server, DB2, flat files (unstructured data objects) and mainframes.
  • Experience working in large team environments, including designing table load specifications for target tables covering source tables, load instructions, table types and logical keys.
  • Experienced working with Data Warehouse/Business Intelligence teams to implement and monitor technology improvements.
  • Worked closely with data architects and system analysts in formulating table load specifications in Informatica.
  • Involved in gap analysis to compare actual performance with potential performance, and in Change Data Capture (CDC) to identify changes.
  • Hands-on experience in tuning mappings and in identifying and resolving performance bottlenecks at the source, target, mapping and session levels.
  • Expertise in writing, debugging and enhancing UNIX shell scripts.
  • Expertise in creating and understanding Star and Snowflake schemas, data modeling, fact and dimension tables and Slowly Changing Dimensions.
  • Good work experience with Informatica data integration tools such as Repository Manager, Designer, Workflow Manager and Workflow Monitor; scheduled workflows using Workflow Manager.
  • Excellent interpersonal and analytical skills with a strong ability to communicate effectively.
  • Expert knowledge of SQL, T-SQL, DTS/SSIS and Analysis Services.
  • Good experience with the MS SQL Server 2000/2005 technology base.

TECHNICAL SKILLS

ETL Tools: Informatica 9.1/8.6.1/8.1/7.3/6.1/5.1 (PowerCenter, PowerConnect, PowerExchange, Designer, Workflow Manager, Repository Manager), IDE/IDQ 9.0 Classic, Informatica Analyst 9.0, Informatica Developer 9.0, SSIS, DataStage.

OLAP/DSS Tools: BO XI (Designer, Desktop/Web Intelligence), Cognos 10.x/8.x/7.3 (Framework Manager, Report Studio, Query Studio, Impromptu, PowerPlay and Visualizer), OBIEE

Scheduling Tools: Autosys, Tidal and Maestro.

Databases: Oracle 11i Apps/10g/9i/8i, Teradata, SAP, DB2/UDB Mainframe, SQL Server, COBOL, Sybase, Siebel.

Testing Tools: WinRunner, QTP, Quality Center and LoadRunner

Web Technologies: JSP, J2EE, VBScript, HTML/DHTML, XSD, XML, MQ Services

Modeling Tools: Erwin, Visio, Rational Rose, Rational ClearQuest.

Technologies: Java, C, C++, .Net, UNIX, LINUX, shell script, SQL, PL/SQL, Toad, T-SQL and Perl.

DB Tools/Design: TOAD, SQL*Plus, SQL*Loader, UML, RUP

PROFESSIONAL EXPERIENCE

Confidential, Mahwah, NJ

Data Architect

Responsibilities:

  • Responsible for gathering and understanding user requirements from the reporting perspective.
  • Modified and created a couple of data models based on the user requirements after performing source and target analysis.
  • Created the dimensional data model (Star and Snowflake schemas) as per business needs and worked extensively on developing the model in Cognos Framework Manager.
  • Worked extensively on designing and creating ETL (Informatica) mappings to move the corresponding information from the Oracle ERP system to the data warehouse, and from the DW to the data mart.
  • Prepared the detail design and data mapping documents after reviewing the technical and functional documents provided by the client through the BA.
  • Worked extensively on developing ETL (Informatica) mappings and assigning tasks to the offshore team as per the detail design and data mapping documents.
  • Used Teradata FastLoad, a multi-session parallel load utility, for initial table loads in bulk mode on a Teradata database; to load more than one table, multiple jobs must be submitted.
  • Used Teradata MultiLoad, a command-driven parallel load utility, for high-speed batch creation and high-volume batch maintenance of multiple tables and views in a Teradata database.
  • Used BTEQ for Teradata queries, along with Teradata Parallel Transporter.
  • Worked extensively on ETL (Informatica) performance tuning and achieved significant improvements in the daily, weekly, monthly and yearly workflows.
  • Used Cognos Framework Manager to add a couple of tables to each model, defined the logical relationships within the models and published the packages.
  • Responsible for leading the offshore team, assigning tasks and conducting daily status meetings to meet the estimated deadlines.
  • Worked in an Agile environment, coordinating with the offshore team on a daily basis.
  • Worked closely with the DBAs to improve database performance and utilization.
  • Responsible for upgrading, installing and configuring Informatica PowerCenter 9.1.
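The BTEQ step above is typically driven from a shell wrapper that writes a control script and submits it in batch. A minimal sketch; the server name, credentials and table are placeholders, not values from any actual environment:

```shell
#!/bin/sh
# Sketch only: tdprod, etl_user/etl_password and edw.daily_sales are
# illustrative placeholders.
# Write a BTEQ control script for a batch row-count check.
cat > daily_counts.btq <<'EOF'
.LOGON tdprod/etl_user,etl_password;
SELECT COUNT(*) FROM edw.daily_sales;
.QUIT;
EOF

# The real job stream would then submit it as:
#   bteq < daily_counts.btq > daily_counts.log 2>&1
```

Keeping the control script generated rather than hand-edited lets the same wrapper serve the daily, weekly and monthly workflows.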

Technology Used: Informatica PowerCenter 8.6/9.1, Oracle 11i, Teradata, BTEQ, COBOL, J2EE, Cognos 8.0, Quality Center, Erwin 7.1, UML, PL/SQL, .Net 3.0, MS Outlook 2010, Visio 2010, MS Office 2010, Windows XP, Oracle Apps 11i, PuTTY, UNIX, Agile methodology.

Confidential

Data Architect/Sr.Informatica Developer/ Analyst

Responsibilities:

  • Responsible for gathering user requirements and discussing with business analysts to acquire data requirements.
  • Responsible for leading the onsite/offshore team, dividing tasks within the team and making sure the work runs in parallel and effectively.
  • Responsible for designing the data dictionary for each application, discussing with the client to identify the critical tables in each application, and preparing the plan/design for profiling those tables.
  • Used IDE (Informatica Data Explorer) to work extensively on column profiling as well as cross-table profiling to identify the quality of the data in each table and the referential integrity constraints within and across applications.
  • Responsible for defining technical validations, discussing with the business to define business validations, generating reports and creating the reference data for data cleansing.
  • Used the Informatica Data Quality (IDQ) Workbench to export and import plans to and from the local repository.
  • Used IDQ to copy local files to the service domain with the File Manager and to generate scorecards.
  • Responsible for bringing GenRe's marketing data from different applications into the landing zone (Oracle) and loading it into Salesforce (SFDC), our SaaS service. Once the visits, mailing lists and company information had been worked in SFDC, that data was extracted, loaded back into the landing zone (Oracle) and reported on with BO XI as the reporting tool.
  • Worked with the Informatica PowerExchange Navigator to extract data from mainframe (VSAM, ADABAS) files by creating data maps, using those definitions in PowerCenter as sources and creating mappings to load the data into the Oracle staging area for profiling.
  • Wrote a UNIX shell script to pick up data files from one server and place them on another on a regular basis.
  • Responsible for guiding the application support team in execution and issue resolution.
  • Used Erwin 7.1 to create the logical model from the existing physical data model to identify the relationships between tables.
  • Used data warehouse Master Data Management (MDM) in source identification, data collection and data transformation.
  • Responsible for data integration using Business Objects Data Integrator and generating reports.
  • Wrote SQL, PL/SQL, stored procedures, triggers and cursors to implement business rules.
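The file-pickup script mentioned above follows a common pattern: sweep a landing directory and move each feed file onward. A minimal local sketch; the directory names and feed file are illustrative placeholders, and a production version would use scp or sftp between servers:

```shell
#!/bin/sh
# Sketch only: paths and the feed-file name are placeholders.
SRC=/tmp/etl_demo/landing
DEST=/tmp/etl_demo/staging
mkdir -p "$SRC" "$DEST"
touch "$SRC/sales_20100101.dat"   # stand-in for an arriving feed file

for f in "$SRC"/*.dat; do
  [ -e "$f" ] || continue         # nothing to pick up tonight
  mv "$f" "$DEST"/                # between servers this would be scp/sftp
done
```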

Technology Used: Informatica PowerCenter 8.6.1, PowerExchange Navigator, IDE, IDQ, Mainframes (VSAM, ADABAS, DB2), Sybase, Salesforce, Netezza, SAP, PL/SQL, Siebel, SQL Server 2008, Oracle 10g/11i, Teradata, BTEQ, COBOL, J2EE, Business Objects XI, Agile, Quality Center, Erwin 7.1, UML, Windows XP, PuTTY, UNIX

Confidential

Sr.Informatica Developer/Data Analyst/Lead

Responsibilities:

  • Interacted with the client and business analysts to acquire the data and client requirements for the project.
  • Responsible for leading the offshore team, guiding them and assigning tasks.
  • Prepared and presented successful software development project plans with the project manager using MS Project.
  • Responsible for modifying the Star schema as per the client's requirements and creating the data mapping document for the ETL developers.
  • Worked on both extracting data from and loading data to Salesforce (Sales, Service & Support, Analytics, and their collaboration platform).
  • Responsible for creating the solution design defining the business flow, the detail design document for the developers, and the supporting documents for code migration from Dev to QA, QA to UAT and UAT to Prod.
  • Worked extensively on bringing the relevant data from Salesforce as per the client's requirements and loading it into the data marts.
  • Created the dimensional data model (Star schema) as per business needs and worked on creating the universe as well as generating BO reports (Webi and Deski).
  • Experience with the Informatica Data Transformation module.
  • Created a generic parser using Informatica B2B DT (Data Transformation) Studio that can parse any Excel or delimited document for any customer.
  • Responsible for data profiling with IDE (Informatica Data Explorer).
  • Used the Informatica Data Quality (IDQ) Workbench to export and import plans to and from the local repository.
  • Used IDQ to copy local files to the service domain with the File Manager and to generate scorecards.
  • Worked with MultiLoad and FastLoad, and used BTEQ for querying Teradata.
  • Wrote a UNIX shell script to pick up data files from one server and place them on another on a regular basis.
  • Responsible for guiding the application support team in execution and issue resolution.
  • Used Erwin 7.1 to create the logical and physical data models.
  • Used data warehouse Master Data Management (MDM) in source identification, data collection and data transformation.
  • Responsible for data integration using Business Objects Data Integrator and generating reports.
  • Worked with SAP BW and Salesforce as source and target, loading data into the ODS and the enterprise data warehouse.
  • Responsible for loading Active Directory by calling an SSIS package from Informatica.
  • Wrote SQL, PL/SQL, stored procedures, triggers and cursors to implement business rules.
  • Involved in creating labels and deployment groups.
  • Worked with Microsoft DTS to extract, transform and load data into a SQL Server database.

Technology Used: Informatica PowerCenter 8.6.1, Netezza, PowerExchange Navigator, Business Objects XI, Cognos 8.0, IDE, IDQ, SAP BW, Oracle ERP (CRM), Quality Center, Erwin 7.1, AIX, Salesforce, Siebel, SQL Server 2008, Oracle 10g/11i, Teradata, COBOL, SQL Management Studio 2008, PL/SQL, UML, .Net 3.0, MS Outlook 2010, Visio 2010, MS Office 2010, Windows XP, Oracle Apps 11i, PuTTY, UNIX

Confidential

Lead Informatica Developer/Data Modeler/Architect

Responsibilities:

  • The core of the project was supporting the Brocade enterprise data warehouse, which covers a vast variety of KPIs, metrics and analysis paths catering to enterprise-wide finance, operations, sales and marketing business areas, and implementing processes and procedures to meet SOX compliance.
  • Worked extensively on T-SQL, creating stored procedures to implement complex business logic.
  • The EDW is primarily sourced from Oracle Apps 11i, Salesforce.com and legacy systems (Deals Desk, CFT and Rugby).
  • Worked on Salesforce (Sales, Service & Support, Analytics, and their collaboration platform).
  • Responsible for extracting data from Salesforce, loading it into the staging area and generating reports.
  • Integrating Salesforce saved precious time and energy; it allowed us to create automated marketing campaigns based on important metrics through a single, one-to-one marketing platform.
  • In addition, many maintenance and software upgrade projects were in progress in the DW space.
  • Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, mapplets, transformations, worklets, reusable transformations, etc.
  • Worked extensively on Data Integrator with Business Objects XI reporting (Desktop Intelligence and Web Intelligence) based on discussions with the users.
  • Involved in data profiling and data loading with the help of IDE/IDQ, defining rules and implementing the transformation logic.
  • Worked with OBIEE for dashboard reporting, building tasks from the Admin tool.
  • Responsible for leading the offshore team (Informatica/BO developers).
  • Analyzed RFEs, modified mappings based on the requests and created CCBs.
  • Responsible for migrating code from DEV to SPT to PROD and for the whole SDLC.
  • Responsible for setting expectations with the users and defining timelines.
  • Worked on creating and loading data into Star and Snowflake schemas (fact and dimension tables) using Slowly Changing Dimensions.

Technology Used: Erwin 7.1, Informatica PowerCenter 8.6.1, IDE, IDQ, Business Objects XI, SAP BW, Oracle ERP (CRM), OBIEE, Service Contract, AIX, XML Files, DB2, Teradata, SQL Server 2008, Oracle 10g, COBOL, SQL Management Studio 2008, UML, .Net 3.0, MS Outlook 2010, Visio 2010, MS Office 2010, Windows XP, Oracle Apps 11i, PuTTY, UNIX

Confidential, Redmond, WA

Sr. Informatica Developer / Admin

Responsibilities:

  • Attended Informatica Corporation demos of Informatica 9.0.
  • Installed Informatica PowerCenter 9.0 and Informatica Analyst and Developer 9.0.
  • Installed EBF patches such as EBF 5125 and EBF 5137 on Informatica 9.0.
  • Responsible for designing and implementing the Informatica architecture and defining its folder structure.
  • Raised service requests for the bugs we found in Informatica 9.0, tracking them and leading sessions with Informatica Corporation.
  • Worked with IDE 8.6.1/9.0 Classic and IDQ 8.6.1/9.0 Classic for data profiling and data validation.
  • Responsible for analyzing data and creating data quality scorecards.
  • Created mappings in PowerCenter and Informatica Developer 9.0 to satisfy business needs based on the IDE and IDQ results from the business users, and scheduled the workflows.
  • Created a couple of stored procedures to access all tables and views from all databases within the servers and load the results into one of the databases on a third server for data profiling by the business users.
  • Used this Stored Procedure transformation in a couple of mappings and used SSRS as the data integrator.
  • Worked with Informatica Developer 9.0 and Informatica Analyst 9.0 to load and profile data from various applications.

Technology Used: Erwin, Informatica PowerCenter 9.0, T-SQL, Informatica Analyst 9.0, Informatica Developer 9.0, IDE 9.0 Classic, IDQ 9.0 Classic, XML Files, DB2, SAP, COBOL, SQL Management Studio 2008, TFS (Team Foundation Server), UML, .Net 3.0, MS Outlook 2010, Visio 2010, MS Office 2010, Windows 7, SQL Server 2008 SP1.

Confidential

ETL Lead

Responsibilities:

  • Interacted with the client and business analysts to acquire the data and client requirements for the project.
  • Created mapping specifications based on the analysis of source data, user requirements and business rules.
  • Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, mapplets, transformations, reusable transformations, etc.
  • Worked with XML, DB2 and flat files as sources, with SQL Server as the target system.
  • Used various transformations such as Aggregator, XML, Custom, Normalizer, Update Strategy, Router, Sorter, Source Qualifier, Filter, Expression, Lookup, Sequence Generator, Joiner, Java, etc.
  • Very good exposure to OBIEE reporting; used Cognos as the data integrator.
  • Used Erwin and Enterprise Architect as the data modeling tools.
  • Used BTEQ for querying the Teradata 12.0 database and worked with Teradata Parallel Transporter.
  • Involved in Oracle partitioning along with the DBAs.
  • Used the IDE and IDQ tools, which have their own repository to maintain metadata and can be configured as client/server.
  • Worked closely with the execution division in generating lead files for the credit card marketing, strategic analysis and risk divisions.
  • Guided the team in software development (SDLC) and project lifecycle methodologies.
  • Strong in preparing data mappings by identifying entity relationships and normalizing up to Third Normal Form (OLTP).
  • Used Erwin 7.1 to create the logical and physical data models.
  • Used Cognos Framework Manager for metadata modeling.
  • Used ReportNet to create reports that analyze corporate data according to specific information needs.

Technology Used: Erwin, Informatica PowerCenter 8.6.1, PowerExchange, Oracle 10g, T-SQL, Netezza, XML Files, DB2, Cognos 8.0, Teradata, LINUX, COBOL, SQL Developer, Toad, TFS (Team Foundation Server), UML, ODBC connections, .Net 3.0, Java (SDK), MS Outlook, Visio, MS Word, Access, Excel, Windows XP, SQL Server 2008, Mainframes.

Confidential

Lead (Informatica Developer/ Data Modeler)

Responsibilities:

  • Interacted with the client and business analysts to acquire the data and client requirements for the project.
  • Prepared and presented successful software development project plans with the project manager using MS Project.
  • Guided the team in software development and project lifecycle methodologies.
  • Involved in the gap analysis, development of mapping specifications and data change impact analysis for the EdChoice and CLS schemas.
  • The project followed an Agile methodology, divided into phases such as applications, payments and the award process.
  • Used partitioning to subdivide tables, indexes and index-organized tables into smaller pieces.
  • Refined data by gathering inputs from the CLS team located in Cleveland.
  • Created the Cleveland Scholarship schema, designed the data entity relationships and integrated it with minimal changes into the existing EdChoice schema.
  • Strong in preparing data mappings by identifying entity relationships and normalizing up to Third Normal Form (OLTP).
  • Used Erwin 7.1 to create the logical and physical data models for the Cleveland Scholarship schema.
  • Worked with the DBA to generate the DDL scripts for the defined data model.
  • Created a data modeling document for the data model, a data flow reference for the project stakeholders, and metadata provisioning.
  • Created a staging table in the staging schema to maintain the relationships between the EdChoice and OEDS schemas.
  • Prepared and implemented an efficient data migration plan. Used a SOA web service to load the data into the OEDS schema. Wrote stored procedures based on the client's requirements.
  • Used Cognos PowerPlay 8.0 to analyze business data and create professional reports efficiently.
  • Used Metric Studio to manage organizational performance by monitoring and analyzing metrics, projects and other performance measures at all levels of the organization.
  • Prepared a single file by extracting from multiple sources such as Teradata 12.0, DB2 and flat files using Informatica with metadata.
  • Wrote a stored procedure for masking the parents' SSN values.
  • Used various transformations such as Custom, Stored Procedure, Java, multiple Lookups, B2B, Filter, Expression, Source Qualifier, Update Strategy, Router, Joiner, Aggregator, Sequence Generator, Union and XML Parser.
  • Created sequential and concurrent session flows within a single workflow.
  • Performed unit testing on Informatica mappings at both the database and application levels.
  • Performed performance tuning for all the Informatica mappings and workflows.
  • Modified the existing Informatica batch process and created additional ones according to the modifications made in the data model per the new business rules.
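The SSN-masking rule above was implemented as a stored procedure; the same pattern on a flat extract can be sketched in shell. The field name and record format are illustrative assumptions:

```shell
#!/bin/sh
# Sketch only: keeps the last four digits of an SSN and masks the rest.
mask_ssn() {
  sed -E 's/[0-9]{3}-[0-9]{2}-([0-9]{4})/XXX-XX-\1/g'
}

echo "parent_ssn=123-45-6789" | mask_ssn   # -> parent_ssn=XXX-XX-6789
```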

Technology Used: Erwin, Informatica PowerCenter 8.6, PowerExchange, BTEQ, Oracle 10g, Lotus Approach database, DB2, Teradata 12.0, T-SQL, COBOL, SQL Developer, Toad, TFS (Team Foundation Server), UML, ODBC connections, .Net 3.0, Java (SDK), MS Outlook, Visio, MS Word, Access, Excel, Mainframes, Windows XP, SQL Server

Confidential

ETL Lead (Informatica Developer)

Responsibilities:

  • Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, mapplets, transformations, reusable transformations, etc.
  • Worked with XML, DB2 and flat files as sources, with SQL Server as the target system.
  • Used various transformations such as Aggregator, XML, Custom, Normalizer, Update Strategy, Router, Sorter, Source Qualifier, Filter, Expression, Lookup, Sequence Generator, Joiner, Java, etc.
  • Used BTEQ for querying the Teradata 11.0 database.
  • Used the IDE and IDQ tools, which have their own repository to maintain metadata and can be configured as client/server.
  • Worked closely with the execution division in generating lead files for the credit card marketing, strategic analysis and risk divisions.
  • Worked in Framework Manager to resolve ambiguous relationships in the metadata.
  • Involved in performance tuning by optimizing sources, targets, mappings and sessions in DB2, and involved in creating instances, databases, tablespaces and triggers in DB2.
  • Used data warehouse Master Data Management (MDM) in source identification, data collection, data transformation, normalization, rule administration, error detection and correction, data consolidation, data storage, data distribution and data governance.
  • Created and managed non-transactional data entities using MDM.
  • Supported Informatica components such as the MSA, SAP, EDW and SIPS interfaces.
  • Created workflows in the Workflow Designer and executed tasks such as sessions and commands using Informatica Workflow Manager.
  • Monitored transformation processes using Informatica Workflow Monitor.
  • Extracted SAP R/3 sources and converted them into flat files.
  • Worked with different types of sources such as flat files, mainframe JCL, JDK, COBOL, SAP R/3 and Oracle, with Oracle 10g as the target.
  • Used WebTrends to integrate data and create best-of-breed solutions to distinct business challenges.
  • Configured dimensions, defined attribute hierarchies, and sorted and grouped attributes while working with cubes and dimensions.
  • Designed and maintained logical and physical dimensional data models (OLAP), a star schema, using Erwin with conformed dimensions.
  • Worked on Slowly Changing Dimensions (Type I and Type II).
  • Designed and developed data validation, load processes, test cases and error control routines using PL/SQL and SQL.
  • Created mapping specifications based on the analysis of source data, user requirements and business rules.
  • Performed unit testing and integration testing using Quality Center and Toad on the mappings in various schemas.
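The Type II handling above (expire the changed current row, insert a new current version) can be illustrated on flat files. The column layout, file names and load date below are all illustrative assumptions, not the project's actual structures:

```shell
#!/bin/sh
# Sketch only: dim.csv is id,attr,eff_date,end_date,is_current and
# stage.csv is id,attr. Changed rows are expired (is_current=N) and
# re-inserted as of the load date; brand-new ids become current rows.
cat > dim.csv <<'EOF'
1,red,2020-01-01,,Y
2,blue,2020-01-01,,Y
EOF
cat > stage.csv <<'EOF'
1,green
3,gold
EOF

awk -F, -v today=2020-06-01 '
  NR==FNR { stage[$1]=$2; next }              # first file: staging rows
  {
    if ($5=="Y") {
      cur[$1]=1
      if (($1 in stage) && stage[$1]!=$2) {   # attribute changed
        print $1","$2","$3","today",N"        # expire the current row
        changed[$1]=1
        next
      }
    }
    print                                     # unchanged history/current
  }
  END {
    for (id in stage)
      if (changed[id] || !(id in cur))
        print id","stage[id]","today",,Y"     # new current versions
  }' stage.csv dim.csv | sort > dim_new.csv
```

In the actual mappings this logic lives in Lookup plus Update Strategy transformations; the awk version only shows the row-versioning rule itself.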

Technology Used: Informatica PowerCenter 8.1, Java, JSP, SSIS, ODBC connections, PowerExchange, Oracle 10g, Teradata, COBOL, Toad, SQL Assistant, DB2/UDB Mainframe, SQL, PL/SQL, Windows XP.

Confidential

Sr.Informatica Developer/SSIS/Data Modeler

Responsibilities:

  • Coordinated with the client to gather the user requirements.
  • Involved in Data Modeling based on the client Requirements.
  • Designed and developed the ETL process using Informatica.
  • Extracted source definitions from various relational sources like Oracle, DB2 and flat files.
  • Used Teradata FastLoad, a multi-session parallel load utility, for initial table loads in bulk mode on a Teradata database; to load more than one table, multiple jobs must be submitted.
  • Used Teradata MultiLoad, a command-driven parallel load utility, for high-speed batch creation and high-volume batch maintenance of multiple tables and views in a Teradata database.
  • Used BTEQ for Teradata queries, along with Teradata Parallel Transporter.
  • Used various transformations in SSIS such as Conditional Split, Aggregate, Copy Column, Lookup, Data Conversion, Multicast, OLE DB Command and Union All.
  • Masked customer information such as SSN, address and full name.
  • Worked with the Slowly Changing Dimensions Type 1 model.
  • Involved in Development, Review and testing of the new system.
  • Wrote SQL, PL/SQL, stored procedures, triggers and cursors for implementing business rules and transformations.
  • Implemented performance tuning techniques for sources, targets, mappings, sessions and SQL queries in the transformations.
  • Tuning the Mappings for Optimum Performance, Dependencies and Batch Design.
  • Responsible for UNIT, System and Integration testing. Developed Test scripts, Test plan and Test Data.
  • Generated complex reports using the BI tool Cognos Report net.

Technology Used: Informatica PowerCenter 8.1, SSIS, SQL Server, Teradata V2R6, DB2, Oracle 10g, BTEQ, LINUX, Mainframes, EditPlus, Toad, UNIX and Windows XP.

Confidential

Informatica Developer

Responsibilities:

  • Coordinated with the client to gather the user requirements.
  • Involved in Data Modeling based on the client Requirements.
  • Designed and developed the ETL process using Informatica.
  • Worked with MultiLoad and FastLoad, and used BTEQ for querying Teradata.
  • Extracted source definitions from various relational sources like Oracle, DB2, mainframes, COBOL and flat files.
  • Converted Oracle data tables into SAS data sets using the SAS SQL Pass-Through Facility and uploaded SAS files into Oracle tables using SAS export procedures such as DBLOAD.
  • Developed warehouse ETL and OLAP application code.
  • Developed Informatica mappings/sessions/workflows for ETL processing and used UNIX shell scripts for smooth application interfacing for Investment Management.
  • Worked in the Property and Casualty insurance domain, where the higher the deductible, the lower the premiums can be.
  • Worked on deductible amounts applied when a claim is paid by the insurance company.
  • Involved in Development, Review and testing of the new system.
  • Wrote SQL, PL/SQL, stored procedures, triggers and cursors for implementing business rules and transformations.
  • Implemented performance tuning techniques for sources, targets, mappings, sessions and SQL queries in the transformations.
  • Tuning the Mappings for Optimum Performance, Dependencies and Batch Design.
  • Responsible for UNIT, System and Integration testing. Developed Test scripts, Test plan and Test Data.
  • Generated complex reports using the BI tool Cognos Report net.

Technology Used: Informatica PowerCenter 8.0, Java, ODBC connections, Oracle 10g, DB2, UNIX, EditPlus, Toad and Windows XP.

Confidential

Data warehouse Developer

Responsibilities:

  • Collaborated with Business analysts and the DBA for requirements gathering, business analysis and designing of the data marts.
  • Worked on dimensional modeling to Design and develop STAR Schemas, identifying Fact and Dimension Tables for providing a unified view to ensure consistent decision making.
  • Worked on HIPAA transactions such as Claims or Encounters (837), Prior Authorization and Referral (278), and Eligibility Inquiry and Response (270/271).
  • Worked with SCD (Slowly Changing Dimensions) Type II.
  • Worked with heterogeneous sources from various channels like Oracle DBMS, SQL Server and flat files.
  • Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.
  • Designed and developed ETL routines using Informatica PowerCenter; within the Informatica mappings, extensively used Lookups, Aggregator, Ranking, mapplets, connected and unconnected stored procedures/functions/Lookups, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
  • Responsible for monitoring all running, scheduled, completed and failed sessions; debugged the mappings of failed sessions.
  • Involved in writing shell scripts on UNIX for the Informatica ETL tool to run the sessions.
  • Tuned the performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.
  • Created Update Strategy transformations for all mappings, which insert, delete and update rows based on the trigger information.
  • Assisted in technical documentation for various processes such as testing, UAT, requirement analysis and results.
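Session-launch shell scripts like the ones above usually wrap Informatica's pmcmd command-line client. A minimal sketch; the service, domain, folder and workflow names are placeholders, and a DRY_RUN guard prints the command instead of invoking it:

```shell
#!/bin/sh
# Sketch only: every name below is a placeholder, not a real environment.
DRY_RUN=${DRY_RUN:-1}
CMD="pmcmd startworkflow -sv Int_Svc -d Domain_ETL -u etl_user -p etl_password -f DW_LOADS -wait wf_daily_load"

if [ "$DRY_RUN" = "1" ]; then
  echo "would run: $CMD"
else
  $CMD                  # submit the workflow and wait for completion
fi
```

The -wait flag makes the script block until the workflow finishes, so the scheduler sees a meaningful exit status.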

Technology Used: Informatica Power Center 6.1, ODBC connections, Oracle 9i/10g, Erwin, UNIX and Windows XP

Confidential

Informatica Developer

Responsibilities:

  • Involved in Analysis, Requirements gathering and documenting technical specifications
  • Worked on Informatica Power Center using various kinds of transformations to implement simple and complex business logics
  • Actively involved in using heterogeneous Source data from Oracle.
  • Extensively used Source Qualifier transformation to filter data at the source level rather than at transformation level
  • Transformations used are Connected & Unconnected Lookups, Router, Expression, Aggregator, Filter, Sequence Generator, Joiner etc
  • Prepared the external pre and post session processes for the Informatica Power Center, using Unix Shell Scripting
  • Involved in preparing detailed Business Analysis documents, ETL design documents and unit test plans for the mappings

Technology Used: Informatica PowerCenter 5.1, Oracle 9i/10g, ODBC connections, TOAD, FTP, COBOL Files, DB2, Flat Files, UNIX and Windows NT/2000.

Confidential

Informatica Developer

Responsibilities:

  • Involved in the analysis phase of the project.
  • Involved in Creating the XML Schema.
  • Use case diagrams were completely designed by me in the design phase.
  • Designed and developed ETL routines using Informatica PowerCenter; within the Informatica mappings, used Lookups, Aggregator, Ranking, mapplets, connected and unconnected stored procedures/functions/Lookups, SQL overrides in Lookups, and source filters in Source Qualifiers.

Technology Used: Windows 98, Apache Tomcat 4.0, Oracle 8.0, Rational Rose Enterprise Edition, Java, JSP, SDK, J2EE.

Confidential

Software Developer

Responsibilities:

  • Involved in the analysis phase of the project.
  • Mainly responsible for the Personal Details domain.
  • Involved in all phases of development for that domain.
  • Partly involved in the design phase of Job Details and Leave Info.
  • Mainly handled the ODBC connectivity.

Technology Used: Windows 98, Oracle 8.0, Java (Core), SDK.
