
Data Architect (Data Modeling/Informatica/Cognos) Resume


Mahwah, NJ

SUMMARY

  • Over 9 years of IT experience in software analysis, design and development for various software applications in client-server environments, with expertise in providing business intelligence solutions in data warehousing for decision support systems.
  • Over 8 years of ETL experience with Informatica (PowerCenter, PowerExchange 9.0/8.x/7.x/6.x/5.x).
  • Currently working as a Data Modeler/Lead Informatica Developer on a project with a team of 10 members.
  • Extensive experience with SQL and PL/SQL, creating views, functions, triggers and stored procedures in data warehouse environments built on relational databases such as Oracle, DB2 and SQL Server.
  • Experienced in packages, procedures, indexing and query tuning.
  • Proficient in designing ETL architectures for data warehousing, MDM, data cleansing and data profiling.
  • Expertise in creating and maintaining relational and dimensional data models, including star and snowflake schemas, and experience in EAI/SOA software delivery projects.
  • Solid experience (4 years) in data modeling, E. F. Codd's principles of normalization (3NF and BCNF normal forms) and the Kimball/Inmon methodologies of dimensional modeling.
  • Good experience with OLAP tools such as Cognos 8.0/7.2/7.1/7.0/6, Cognos ReportNet 1.0/1.1 MR2/MR3, Cognos Framework Manager and Business Objects XI/6.0.
  • Used the Cognos IWR service "CognosAuditData" to read the session logs created on the server, fetch the relevant data for each report and load it into the content database tables.
  • Expertise in creating current-state and future-state diagrams.
  • Experience in Informatica performance tuning methods and data quality.
  • Very good exposure to integrating Informatica Data Quality (IDQ) and IDE into ETL on demand.
  • Development and production support of Informatica ETL mappings for SAP ERP systems and SAP BW/BI.
  • Extensive experience loading data from flat files, XML, Oracle, Teradata 12.0 and DB2 into SQL Server and Oracle targets with Informatica.
  • Proficient in data modeling using Erwin to create physical and logical data models; managed ODBC connections on almost all projects.
  • Experience in creating data mart architecture and design and Entity Relationship (ER) diagrams.
  • Expertise with OLAP and OLTP systems and multi-tier Boolean logic.
  • Experience in developing both warehouse ETL and OLAP application code.
  • Worked on Salesforce (Sales, Service & Support, Analytics, and their collaboration platform).
  • Experience with UNIX installation and business intelligence tools such as Business Objects XI, Cognos and MicroStrategy.
  • Highly experienced in integrating heterogeneous data sources such as Oracle, COBOL, SQL Server, DB2, flat files (unstructured data objects) and mainframes.
  • Experience working in large team environments, including designing table load specifications for target tables covering source tables, load instructions, table type and logical keys.
  • Experienced working with data warehouse/business intelligence teams to implement and monitor technology improvements.
  • Worked closely with data architects and system analysts in formulating table load specifications in Informatica.
  • Involved in gap analysis to compare actual performance with potential performance, and in Change Data Capture (CDC) to identify changed records.
  • Hands-on experience in tuning mappings and in identifying and resolving performance bottlenecks at the source, target, mapping and session levels.
  • Expertise in writing/debugging/enhancing UNIX Shell Scripts.
  • Expertise in creating and understanding the Star and Snowflake Schema, Data Modeling, Fact and Dimensional Tables and Slowly Changing Dimensions.
  • Good working experience with Informatica data integration tools such as Repository Manager, Designer, Workflow Manager and Workflow Monitor; scheduled workflows using Workflow Manager.
  • Excellent interpersonal and analytical skills with strong ability to communicate effectively.
  • Expert knowledge of SQL, T-SQL, DTS/SSIS, Analysis Services.
  • Good experience using the MS SQL Server 2000/2005 technology base.

TECHNICAL SKILLS

ETL Tools: Informatica 9.1/8.6.1/8.1/7.3/6.1/5.1 (PowerCenter, PowerConnect, PowerExchange, Designer, Workflow Manager, Repository Manager), IDE/IDQ 9.0 Classic, Informatica Analyst 9.0, Informatica Developer 9.0, SSIS, DataStage.

OLAP/DSS Tools: BO XI (Designer, Desktop/Web Intelligence), Cognos 10.x/8.x/7.3 (Framework Manager, Report Studio, Query Studio, Impromptu, PowerPlay and Visualizer), OBIEE

Scheduling Tools: Autosys, Tidal and Maestro.

Databases: Oracle 11i Apps/10g/9i/8i, Teradata, SAP, DB2/UDB Mainframe, SQL Server, COBOL, Sybase, Siebel.

Testing Tools: Win Runner, QTP, Quality Center and Load Runner

Web Technologies: JSP, J2EE, VBScript, HTML/DHTML, XSD, XML, MQ Services

Modeling Tools: Erwin, Visio, Rational Rose, Rational Clear Quest.

Technologies: Java, C, C++, .Net, UNIX, LINUX, Shell script, SQL, PL/SQL, Toad, TSQL and Perl.

DBTools/Design: TOAD, SQL* Plus, SQL* Loader, UML, RUP

PROFESSIONAL EXPERIENCE

Confidential, Mahwah, NJ

Data Architect (Data Modeling/Informatica/Cognos)

Responsibilities:

  • Responsible for gathering and understanding user requirements from the reporting perspective.
  • Modified and created several data models based on user requirements after performing source and target analysis.
  • Created the dimensional data models (star and snowflake schemas) per business needs and worked extensively on developing the models in Cognos Framework Manager.
  • Worked extensively on designing and creating Informatica ETL mappings to move the corresponding information from the Oracle ERP system to the data warehouse and from the DW to the data marts.
  • Prepared the detail design and data mapping documents after reviewing the technical and functional documents provided by the client through the BA.
  • Worked extensively on developing Informatica ETL mappings and assigning tasks to the offshore team per the detail design and data mapping documents.
  • Used Teradata FastLoad, a multi-session parallel load utility, for initial bulk table loads on the Teradata database; because each FastLoad job loads a single table, multiple jobs were submitted to load more than one table.
  • Used Teradata MultiLoad, a command-driven parallel load utility, for high-speed batch creation and high-volume batch maintenance of multiple tables and views in the Teradata database.
  • Used BTEQ for Teradata queries and worked with Teradata Parallel Transporter (a minimal BTEQ sketch follows this list).
  • Worked extensively on Informatica ETL performance tuning and achieved significant improvements in the daily, weekly, monthly and yearly workflows.
  • Used Cognos Framework Manager to add a couple of tables to each model, defined the logical relationships within the models and published the packages.
  • Responsible for leading the offshore team, assigning tasks and conducting daily status meetings to meet the estimated deadlines.
  • Worked in an Agile environment, coordinating with the offshore team on a daily basis.
  • Worked closely with the DBA on improving database performance and utilization.
  • Responsible for the upgrade, installation and configuration of Informatica PowerCenter 9.1.
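As a reference for the Teradata bullets above, here is a minimal BTEQ sketch of the kind of staging load described. It is illustrative only: the logon values, the stg_sales table and the sales.dat file are assumptions, not details from the project.

```sql
-- Minimal BTEQ staging-load sketch; server, credentials, table and file names are assumed.
.LOGON tdprod/etl_user,etl_password;

.IMPORT VARTEXT '|' FILE = /data/inbound/sales.dat;
.QUIET ON
.REPEAT *
USING (order_id VARCHAR(18), order_dt VARCHAR(10), amount VARCHAR(18))
INSERT INTO stg_db.stg_sales (order_id, order_dt, amount)
VALUES (:order_id, CAST(:order_dt AS DATE FORMAT 'YYYY-MM-DD'), :amount);

.QUIT;
```

FastLoad and MultiLoad would then take over for bulk loads and multi-table maintenance, since BTEQ itself is best suited to queries and modest-volume imports.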

Confidential, Stamford, CT

Data Architect / Sr. Informatica Developer / Analyst

Responsibilities:

  • Responsible for gathering user requirements and discussing with business analysts to acquire data requirements.
  • Responsible for leading the onsite/offshore team, dividing tasks within the team and making sure the work ran in parallel and effectively.
  • Responsible for designing the data dictionary for each application, discussing with the client to identify the critical tables in each application, and preparing the plan/design for profiling those tables.
  • Used IDE (Informatica Data Explorer) extensively for column profiling and cross-table profiling to identify the quality of the data in each table as well as the referential integrity constraints within and across applications.
  • Responsible for defining the technical validations, discussing with the business to define the business validations, generating the reports and creating the reference data for data cleansing.
  • Used Informatica Data Quality (IDQ) Workbench to export and import plans to and from the local repository.
  • Used IDQ to copy local files to the service domain with the File Manager and to generate scorecards.
  • Responsible for bringing GenRe's marketing data from different applications into the landing zone (Oracle) and loading it into Salesforce (SFDC), our SaaS service; after the visits, mailing lists and company info were worked in SFDC, the data was extracted back into the Oracle landing zone and reports were generated with BO XI as the reporting tool.
  • Worked with the Informatica PowerExchange Navigator to extract data from mainframe (VSAM, ADABAS) files by creating data maps, using those definitions as sources in PowerCenter and creating mappings to load the data into the Oracle staging area for profiling.
  • Worked on UNIX shell scripts to pick up data files from one server and place them on another on a regular basis.
  • Responsible for guiding the application support team in execution and issue resolution.
  • Used Erwin 7.1 to create the logical model from the existing physical data model to identify the relationships between tables.
  • Used data warehouse Master Data Management (MDM) in source identification, data collection and data transformation.
  • Responsible for Data Integrator using Business Objects and for generating reports.
  • Wrote SQL and PL/SQL stored procedures, triggers and cursors to implement business rules (a minimal sketch follows this list).
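To make the last bullet concrete, here is a minimal PL/SQL sketch of a business-rule procedure and audit trigger. The tables (orders, customers, order_audit) and the 10,000 credit-hold threshold are illustrative assumptions, not rules from the engagement.

```sql
-- Illustrative PL/SQL; orders, customers, order_audit and the rule itself are assumptions.
CREATE OR REPLACE PROCEDURE apply_credit_hold (p_customer_id IN NUMBER) IS
  v_balance NUMBER;
BEGIN
  SELECT NVL(SUM(amount), 0) INTO v_balance
    FROM orders
   WHERE customer_id = p_customer_id;

  -- Business rule: place customers whose open balance exceeds 10,000 on credit hold.
  IF v_balance > 10000 THEN
    UPDATE customers
       SET credit_hold = 'Y'
     WHERE customer_id = p_customer_id;
  END IF;
END apply_credit_hold;
/

-- Companion trigger auditing every insert or update on orders.
CREATE OR REPLACE TRIGGER trg_orders_audit
AFTER INSERT OR UPDATE ON orders
FOR EACH ROW
BEGIN
  INSERT INTO order_audit (order_id, changed_on, changed_by)
  VALUES (:NEW.order_id, SYSDATE, USER);
END;
/
```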

Confidential, San Rafael, CA

Sr. Informatica Developer / Data Analyst / Lead

Responsibilities:

  • Interacted with the client and business analysts to acquire the data and client requirements for the project.
  • Responsible for leading the offshore team by guiding them and assigning tasks.
  • Prepared and presented successful software development project plans along with the project manager using MS Project.
  • Responsible for modifying the star schema per client requirements and creating the data mapping document for the ETL developers.
  • Worked on both extracting data from and loading data to Salesforce (Sales, Service & Support, Analytics, and their collaboration platform).
  • Responsible for creating the solution design to define the business flow, the detail design document for the developers, and the sub-documents for code migration from Dev to QA, QA to UAT and UAT to Prod.
  • Worked extensively on bringing the relevant data from Salesforce per client requirements and loading it into the data marts.
  • Created the dimensional data model (star schema) per business needs and worked on creating the universe as well as generating the BO reports (Webi and Deski).
  • Experience with the Informatica Data Transformation module.
  • Created a generic parser using Informatica B2B DT (Data Transformation) Studio that can parse any Excel or delimited document for any customer.
  • Responsible for data profiling with IDE (Informatica Data Explorer).
  • Used Informatica Data Quality (IDQ) Workbench to export and import plans to and from the local repository.
  • Used IDQ to copy local files to the service domain with the File Manager and to generate scorecards.
  • Worked with MultiLoad and FastLoad, and used BTEQ to query Teradata.
  • Worked on UNIX shell scripts to pick up data files from one server and place them on another on a regular basis.
  • Responsible for guiding the application support team in execution and issue resolution.
  • Used Erwin 7.1 to create the logical and physical data models.
  • Used data warehouse Master Data Management (MDM) in source identification, data collection and data transformation.
  • Responsible for Data Integrator using Business Objects and for generating reports.
  • Worked with SAP BW and Salesforce as both source and target, loading data into the ODS and the enterprise data warehouse.
  • Responsible for loading Active Directory by calling an SSIS package from Informatica.
  • Wrote SQL and PL/SQL stored procedures, triggers and cursors to implement business rules (a cursor-based sketch follows this list).
  • Involved in creating labels and deployment groups.
  • Worked with Confidential DTS to extract, transform and load data into the SQL Server database.
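As referenced above, here is a minimal cursor-based PL/SQL sketch of the kind of row-by-row business-rule processing described. The stg_accounts and dim_account tables and the region rule are illustrative assumptions.

```sql
-- Illustrative cursor sketch; stg_accounts, dim_account and the rule are assumptions.
DECLARE
  CURSOR c_stage IS
    SELECT account_id, account_name, region
      FROM stg_accounts;
BEGIN
  FOR r IN c_stage LOOP
    -- Business rule: only accounts with a known region flow into the mart.
    IF r.region IS NOT NULL THEN
      MERGE INTO dim_account d
      USING (SELECT r.account_id AS account_id FROM dual) s
         ON (d.account_id = s.account_id)
       WHEN MATCHED THEN
         UPDATE SET d.account_name = r.account_name, d.region = r.region
       WHEN NOT MATCHED THEN
         INSERT (account_id, account_name, region)
         VALUES (r.account_id, r.account_name, r.region);
    END IF;
  END LOOP;
  COMMIT;
END;
/
```

A set-based MERGE over the whole staging table would normally outperform the loop; the cursor form is shown only because the bullet calls cursors out explicitly.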

Confidential, San Jose, CA

Lead Informatica Developer/Data Modeler/Architect

Responsibilities:

  • The core of the project was supporting the Brocade enterprise data warehouse, which covers a vast variety of KPIs, metrics and analysis paths catering to enterprise-wide finance, operations, sales and marketing business areas, and implementing processes and procedures to meet SOX compliance.
  • Worked extensively on T-SQL, creating stored procedures to implement complex business logic.
  • The EDW is primarily sourced from Oracle Apps 11i, Salesforce.com and legacy systems (Deals Desk, CFT and Rugby).
  • Worked on Salesforce (Sales, Service & Support, Analytics, and their collaboration platform).
  • Responsible for extracting data from Salesforce, loading it into our staging area and generating reports.
  • Integrating Salesforce saved precious time and energy; this integration allowed us to create automated marketing campaigns based on important metrics through a single, one-to-one marketing platform.
  • In addition, many maintenance and software upgrade projects were in progress in the DW space.
  • Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, mapplets, transformations, worklets, reusable transformations, etc.
  • Worked extensively on Data Integrator with Business Objects XI reporting (Desktop Intelligence and Web Intelligence) based on discussions with the users.
  • Involved in data profiling and data loading with IDE/IDQ, defining the rules and implementing the transformation logic.
  • Worked with OBIEE for dashboard reporting, building tasks from the Admin tool.
  • Responsible for leading the offshore team (Informatica developers/BO).
  • Analyzed RFEs, modified mappings based on the requests and created CCBs.
  • Responsible for migrating code from DEV to SPT to PROD and for the whole SDLC.
  • Responsible for setting expectations with the users and defining the timelines.
  • Worked on creating and loading star and snowflake schemas (fact and dimension tables) with slowly changing dimensions (an SCD Type 2 sketch follows this list).
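Since the stored procedures on this project were T-SQL (see the second bullet), here is a minimal T-SQL sketch of an SCD Type 2 dimension load like the one described in the last bullet. The stg_customer and dim_customer tables and their columns are illustrative assumptions.

```sql
-- Illustrative SCD Type 2 load; stg_customer, dim_customer and columns are assumptions.
-- Step 1: expire current dimension rows whose tracked attributes changed in staging.
UPDATE d
   SET d.effective_end = GETDATE(),
       d.is_current    = 0
  FROM dim_customer AS d
  JOIN stg_customer AS s
    ON s.customer_id = d.customer_id
 WHERE d.is_current = 1
   AND (s.customer_name <> d.customer_name OR s.region <> d.region);

-- Step 2: insert a fresh current row for every new or changed customer.
INSERT INTO dim_customer (customer_id, customer_name, region,
                          effective_start, effective_end, is_current)
SELECT s.customer_id, s.customer_name, s.region, GETDATE(), NULL, 1
  FROM stg_customer AS s
  LEFT JOIN dim_customer AS d
    ON d.customer_id = s.customer_id
   AND d.is_current = 1
 WHERE d.customer_id IS NULL;
```

Running step 1 before step 2 means changed customers have no current row left, so step 2's anti-join re-inserts them alongside genuinely new customers.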

Confidential, Redmond, WA

Sr. Informatica Developer / Admin

Responsibilities:

  • Attended the Informatica Corporation demos on Informatica 9.0.
  • Installed Informatica PowerCenter 9.0 and Informatica Analyst and Developer 9.0.
  • Installed EBF patches such as EBF 5125 and EBF 5137 on Informatica 9.0.
  • Responsible for the design, implementation and definition of the folder structure for the Informatica architecture.
  • Raised service requests for the bugs we found in Informatica 9.0, tracking them and leading the sessions with Informatica Corporation.
  • Worked with IDE 8.6.1/9.0 Classic and IDQ 8.6.1/9.0 Classic for data profiling and data validations.
  • Responsible for analyzing and creating data quality scorecards.
  • Created mappings in PowerCenter and Informatica Developer 9.0 to satisfy business needs based on the IDE and IDQ results from the business users, and scheduled the workflows.
  • Created a couple of stored procedures to access all the tables and views from all the databases within the servers and load the results into a database on a third server for data profiling by the business users (a T-SQL sketch follows this list).
  • Used these stored procedures via the Stored Procedure transformation in a couple of mappings and used SSRS as the data integrator.
  • Worked with Informatica Developer 9.0 and Informatica Analyst 9.0 for loading and profiling the data from various applications.
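Here is a minimal T-SQL sketch of the metadata-gathering procedure described two bullets above, assuming SQL Server 2005 or later. The PROFILE_DB.dbo.object_inventory target table and all names are illustrative assumptions.

```sql
-- Illustrative metadata collector; PROFILE_DB.dbo.object_inventory is an assumed target.
CREATE PROCEDURE dbo.collect_object_inventory
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @db  sysname,
            @sql nvarchar(max);

    -- Walk every online database on the server and record its tables and views.
    DECLARE db_cur CURSOR FOR
        SELECT name FROM sys.databases WHERE state_desc = 'ONLINE';

    OPEN db_cur;
    FETCH NEXT FROM db_cur INTO @db;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'INSERT INTO PROFILE_DB.dbo.object_inventory
                         (database_name, schema_name, object_name, object_type)
                     SELECT ' + QUOTENAME(@db, '''') + N', TABLE_SCHEMA, TABLE_NAME, TABLE_TYPE
                       FROM ' + QUOTENAME(@db) + N'.INFORMATION_SCHEMA.TABLES;';
        EXEC sys.sp_executesql @sql;
        FETCH NEXT FROM db_cur INTO @db;
    END

    CLOSE db_cur;
    DEALLOCATE db_cur;
END;
```

A cross-server variant would point the INSERT target at the third server through a linked server name, which is why the destination is kept fully qualified here.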
