
Lead Data Architect / Sr Data Modeler Resume


Charlotte, NC

SUMMARY

  • Over 14 years of IT experience across the System Development Life Cycle - Data Architecture, Analysis, Design, Modeling, Development, Implementation and Support - with extensive exposure to MDM, database, data warehouse, ETL and SOA processes and methodologies, with a major focus on data warehousing best practices (Kimball and Bill Inmon), Business Intelligence and database applications
  • Seasoned Data Architect, Engineer and Technologist providing thought leadership for strategy and architecture, defining technology roadmaps and practicing architecture frameworks such as TOGAF and Zachman
  • Expertise in developing conceptual, logical and physical data models for SQL and NoSQL ecosystems; ADS/APPs, Data Lakes and data quality; SQL and PL/SQL development; extract, transform and load (ETL) strategies; tuning; replication; solving object-relational mapping challenges; and analyzing business and technical requirements and mapping them to implementation strategies and EDW systems architecture design
  • Over a decade of hands-on experience in diverse data domains such as Finance, Brokerage, Banking, Insurance, Telecom and Healthcare, building data warehouses, relational databases and data marts on Teradata, Oracle, DB2 and SQL Server, with extensive use of ETL technology suites, dimensional modeling with both Kimball and Inmon approaches, and data management and data integration techniques, besides metadata and Master Data Management
  • ETL expertise with the Informatica suite (PowerCenter, MDM and IDQ) for data warehousing projects from inception to implementation; delivered high-quality solutions within the required SLAs, surpassing business expectations in data cleansing, customer deduplication, data transformation and standardization efforts
  • Expertise in performance tuning at system, session, mapping, transformation, source and target levels to optimize ETL batch times; effectively tune and monitor RDBMS performance at both the application and database level
  • Expertise with solution designs, technical design documents, HLDs and LLDs produced by translating BRDs and FRDs, and with creating mapping documents containing the required transformation logic from sources to targets to support fact and dimensional models, using ERwin, Visio and Embarcadero
  • Experience using Teradata, with extensive use of TD load utilities, Teradata Data Mover (TDM) and TPT
  • Experience with BI tools such as OBIEE and SAP BI, with extensive working knowledge of designing dashboards, developing reports and pages, scheduling and automating reports for the business, creating alerts on data availability, and implementing multiple data access and security levels to serve various business groups
  • Extensive experience preparing efficient UNIX scripts, XML and well-tuned SQL queries to load, transform and validate data on Teradata, SQL Server and Oracle
  • Very strong SQL, PL/SQL, T-SQL and SQL*Loader skills in Windows and UNIX shell script environments, including use of nested table expressions, unions, multi-table joins, scalar functions, outer joins, derived columns and creative use of functions and operators
  • Conducted successful proof-of-concept tests for various data solutions across relational DB and Big Data ecosystems, covering ACID and BASE compliance, SCDs with CDC techniques, distributed processing of large data sets with Spark, MongoDB, Netezza and Vertica, and real-time CDC solutions with PowerExchange and GoldenGate for cross-environment data loads and access within SLA
  • Excellent communication, business interaction and interpersonal skills to communicate effectively with technology executive leadership and champion new ideas
  • Led large on-site and offshore teams for several EDW initiatives; designed and defined coding standards and best practices for ETL development, deployment, change management and production support processes, and successfully presented ETL data integration efforts to gain support for the projects from executive and technology leadership

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.0/8.6.1/8.1.1/7.1/6.2/5.1, IDQ, Ab Initio

BI Tools: OBIEE 10.1.3.4, Business Objects v6.5/5.0

Languages: C++, Visual Basic, ASP.NET

Scripting tools: XML, JSON, VBScript, JavaScript, HTML, DHTML, UNIX shell scripts

Databases: Teradata (V2R5, V2R6, V2R7 and TD12), Oracle (10g, 9i, 8i, 7.x), DB2, Data Lake, SQL Server 2008/2005/2000, MongoDB, Vertica, Hyperion System 9 MDM, IBM System/390, NoSQL, MS Access

Tools/Utilities: ERwin 9.0, Spark, AWS, ER Studio, Teradata utilities (BTEQ, FastLoad, MultiLoad, TPump, FastExport), Ab Initio GDE/Co-Op, SQL Assistant, Queryman, Clarity 7.5.3/8.x, TOAD, AutoSys, IIS, UC4, AppWorx, IBM Tivoli scheduler, JCL, MQ, Control-M, TIBCO Rendezvous 6.x, CA Unicenter Workload Control Center, MS Office, Visio, SQL*Loader

Operating Systems: UNIX, Linux, MS-DOS and Windows NT/95/98/2000

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

Lead Data Architect / Sr Data Modeler

Responsibilities:

  • Delivered end-to-end architecture and design, and led the data modeling activities taken up from a data architecture deliverables standpoint.
  • Worked closely with LOBs, the tech/functional teams of various SORs, SMEs and data teams to architect, design and develop the data delivery solution providing information to the client teams, facilitating regulatory reporting for the Global Wholesale Banking Technology group.
  • Successfully led an interim data delivery initiative for 2017 on time with 10+ member on-site and offshore teams, covering data sourcing, integration and provisioning, while keeping in view the strategic Data Lake that is in the works
  • Provided in-depth analysis of the data, processes, design and systems of the existing client OLTP systems to evaluate the current state of data provisioning, and architected and designed the future state of regulatory reporting from both interim and strategic perspectives
  • Prepared data model design documents (conceptual, logical and physical) and solution designs for review by architecture leadership for all data initiatives taken up
  • Collaborated with tech/business teams to physicalize and implement the physical models for the planned data initiatives
  • Audited and reviewed design, development, data analysis and data profiling deliverables and data delivery plans, and helped produce mapping documents and data dictionaries

Environment: ER Studio, SQL Server, SSIS/SSRS, JSON, MongoDB, Oracle, Data Lake/APP, AIX 5.3, Excel, WinSCP (FTP), Windows XP, MS SharePoint, JIRA, SVN, TOAD

Confidential, Charlotte, NC

Lead Technology/Data Architect / Sr Data Modeler

Responsibilities:

  • Delivered end-to-end architecture and design, and led the implementation of initiatives taken up with SOA and data architecture deliverables.
  • Worked closely with business and techno-functional teams of various LOBs and with data modeling teams to architect, design and develop the data delivery solution providing information to stakeholders across the Global Wholesale Banking Technology group.
  • Successfully led two major data delivery initiatives of $6MM each, delivered on time with 20+ member on-site and offshore teams, covering data sourcing, integration and provisioning for Wholesale Banking deposit data marts provided to downstream LOB/BI environments
  • Provided in-depth analysis of the processes, design and systems of the existing systems to evaluate the current state of data provisioning, and architected and designed the future state of TrADS data provisioning
  • Prepared design documents (HLD) and solution designs for review by architecture leadership for all data initiatives taken up
  • Collaborated with tech/business teams to partner on and support the delivery of conceptual, logical and physical models for the planned data initiatives
  • Audited and reviewed design, development, data analysis and data profiling deliverables, as well as data delivery designs and mapping documents for the development teams
  • Guided and championed efficient data delivery processes from TrADS on the GWBT side

Environment: ER Studio, ERwin 9.0, Informatica PowerCenter 9/8.6, Informatica MDM, AWS, OBIEE, Microsoft Visio, IIS, WCF, UNIX, Oracle 12c, Exadata X6, Hadoop, Spark, JSON, Teradata 14, SQL Server, PuTTY, AIX 5.3, Excel, WinSCP (FTP), Windows XP, MS SharePoint, JIRA, SVN, TOAD

Confidential, Charlotte, NC

Lead Data Architect / Data modeler consultant

Responsibilities:

  • Designed and delivered comprehensive data architecture deliverables including conceptual, logical and physical data models, data profiling, data analysis, schema design, and data design documents such as mapping documents; contributed to data dictionaries and data catalogues for enterprise-wide MDM; published and maintained versions of DDL scripts
  • Worked with the Enterprise Architecture group to conform to the data quality and standardization policies of data governance, getting all produced data solution designs reviewed and approved
  • Successfully produced several data model deliverables and data architecture solutions for consolidated reporting across all asset classes (Securities, Derivatives, Private Equities and Mortgage), delivering data for risk compliance, Real Estate and SAGIC reporting, and incorporated Real Assets into the FDR data warehouse so BI could leverage it for LOB reporting.
  • Produced SOA architecture deliverables for the RESTful web services used in the data initiatives with Amazon Web Services to subscribe to and ingest data for the standardization and transformation stages of the data sourcing and provisioning processes
  • Worked closely with tech leads and development teams on all data needs, creating data objects for multiple initiatives progressing in parallel sprints and providing any design changes required from a modeling/architecture perspective to keep the living documents current, i.e. logical/physical models and DDLs for tables, views, materialized views, keys, indices etc. (an illustrative DDL sketch follows below)
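
As a minimal sketch of the kind of versioned DDL objects described above - the table, index and materialized view names are hypothetical, not the actual FDR objects:

    -- Illustrative only: simplified Oracle table, index and materialized view DDL
    CREATE TABLE asset_position (
        position_id     NUMBER        NOT NULL,
        asset_class_cd  VARCHAR2(10)  NOT NULL,
        as_of_dt        DATE          NOT NULL,
        market_value    NUMBER(18,2),
        CONSTRAINT pk_asset_position PRIMARY KEY (position_id)
    );

    CREATE INDEX ix_asset_position_class_dt ON asset_position (asset_class_cd, as_of_dt);

    -- Summary object refreshed on demand for downstream reporting consumers
    CREATE MATERIALIZED VIEW mv_asset_class_summary
        BUILD IMMEDIATE
        REFRESH COMPLETE ON DEMAND
    AS
    SELECT asset_class_cd, as_of_dt, SUM(market_value) AS total_market_value
    FROM   asset_position
    GROUP  BY asset_class_cd, as_of_dt;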

Environment: ER Studio, Oracle 11g, Informatica PowerCenter 9/8.6, AWS, Informatica MDM, Teradata TD14, OBIEE, Microsoft Visio, SQL Server, UNIX, PuTTY, AIX 5.3, Excel, WinSCP (FTP), Windows XP, MS SharePoint, JIRA, SVN, TOAD

Confidential, Charlotte, NC

Data Modeler/ Solutions Architect

Responsibilities:

  • Designed, developed and deployed DW/BI data design deliverables for front/middle/back office trading applications from various source systems such as PAM, Amber and Bloomberg, and for data marts in the CITT data warehouse, to deliver real-time BI/data solutions for the Capital Investments and Global Finance LOBs and top leadership teams, generating cutting-edge reporting and enabling daily trade/holdings reconciliations in an accurate and time-sensitive manner.
  • Provided strategy, modeling and architecture for consumer marketing technology and various other teams to develop, present and optimize initiatives in business intelligence and enterprise data across multiple lines of business.
  • Built roadmaps and developed solution decks to conceptualize business use cases with technologies such as Informatica, DataStage, Teradata, Netezza, Hadoop and Vertica.
  • Provided architectural support for conceptual/logical/physical data modeling and data governance, reviewed solution designs for best practices, and participated in architectural reviews of the HLDs and LLDs for new initiatives.

Environment: Informatica PowerCenter 9/8.6, ER Studio, Teradata TD12, Oracle 11g, MDM, SQL Server, IIS, AutoSys, Perl, XML, OBIEE, ERwin, UNIX, PuTTY, AIX 5.3, Excel, WinSCP (FTP), Vertica, Windows XP, MS SharePoint, JIRA, SVN, TOAD

Confidential, Charlotte, NC

EDW/ Data Architect

Responsibilities:

  • Responsible for defining the information architecture and technology infrastructure along with the design of multi-dimensional data models and the Business Intelligence delivery solutions; enhanced the dimensional data model to meet growing requirements arising from a corporate merger and reconciliation of MDM data.
  • Inventoried existing EDW design, ETL load processes, EDW systems, business functions and the level of recovery planning already in effect; compared against existing DR capabilities and developed detailed deltas for hardware, software, personnel etc. for various application areas; developed a plan to incrementally improve DR capabilities for improved data protection.
  • Reverse-engineered logical data models from four poorly documented existing physical data models that supported storage and retrieval of consumer and commercial loans data, supporting the re-factoring of the database design to optimize EDW deliverables.
  • Designed, developed and optimized medium to complex ETL processes with ETL jobs, mappings and workflows/sequence jobs for BI solutions, sourcing various types of data from RDBMS, flat files, XML etc. into the data mart (MDM).
  • Advised on and reviewed ETL processes extensively across various stages/transformations such as Transformer/SCD, Funnel, Aggregator, Expression, Filter, Source Qualifier, Rank, Joiner, Union, Sequence Generator, Update Strategy, Lookup and Stored Procedure, both to develop complex ETL processes and to tune and optimize existing ones, translating business rules into ETL data deliverables.
  • Tuned and modified jobs, sequence jobs, mappings, reusable components such as mapplets, sessions, worklets, workflows and XML source definitions for the data to be loaded.
  • Worked on the ETLs developed to confirm the data quality, quantity, accuracy and efficacy of the ETL transformations designed and developed; designed and tuned DB2 and Oracle SQL queries to optimize the load process in addition to designing the mappings.
  • Created data catalogues and mapping documents with the required transformation logic, sources and targets for facts and dimensions, and designed the ETL processes using Informatica PowerCenter 8.6.1 to extract, transform and load data from multiple input sources such as Oracle, flat files and DB2 to the Teradata data marts and the MDM.
  • Conducted successful proof-of-concept tests for various optimized ETL solutions for SCDs with CDC techniques and for distributed processing of large data sets with Hadoop and Netezza tools.
  • Designed and developed BI solutions for both full data refreshes and incremental data loads, using triggers for CDC (change data capture) to sync the load process into the MDM in line with the various source systems for real-time data migration to Teradata with load utilities, and applied PPIs, NUPIs and UPIs appropriately (see the SCD/CDC sketch after this list).
  • Provided an expert understanding of metadata and of relational and dimensional modeling of the enterprise data, and analyzed the fact and dimension tables as well as the logical tables for the presentation layer of BI solutions with OBIEE.
  • Designed and developed BI solutions in OBIEE with pages, reports and dashboards over data loaded into various relational databases by the ETL processes.
  • Worked within an Agile development framework to provide fast-paced, time-sensitive ETL solutions, using JIRA to work the tasks requested by the business, interacting constantly with business groups, and implementing code releases in packages and hot fixes for emergency issues after all testing and UAT was complete.
  • Worked with various external teams to provide solutions for the reporting needs of the business with accurate data, resolved reported issues promptly with hot fixes, and streamlined the month-end loads to provide robust data on time, every time.
  • Led and guided the EDW BI teams with best practices and revised coding standards while developing BI solutions, and led multiple releases in packages with all the architecture reviews, change management processes and code reviews.
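
As a rough illustration of the SCD Type 2 / CDC loading pattern referenced above, the following is a simplified SQL sketch; customer_dim and stg_customer_delta are hypothetical names, and in practice the logic was driven through Informatica mappings rather than hand-written SQL:

    -- Hypothetical SCD Type 2 close-and-insert pattern (names illustrative)
    -- 1. Expire the current row for every customer that arrived in the CDC delta
    UPDATE customer_dim
    SET    eff_end_dt   = CURRENT_DATE,
           current_flag = 'N'
    WHERE  current_flag = 'Y'
    AND    customer_id IN (SELECT customer_id FROM stg_customer_delta);

    -- 2. Insert the new version of each changed or newly arrived customer
    INSERT INTO customer_dim
        (customer_id, customer_nm, segment_cd, eff_start_dt, eff_end_dt, current_flag)
    SELECT customer_id, customer_nm, segment_cd, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer_delta;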

Environment: ERwin 8.0/8.5, Informatica PowerCenter 8.6, SQL Server 2008, Teradata TD12, Oracle 10g, MDM, Siebel, DB2, XML, OBIEE, Microsoft Visio, UNIX, PuTTY, AIX 5.3, Excel, WinSCP (FTP), Windows XP, AutoSys, MS SharePoint, JIRA, SVN, TOAD.

Confidential, Buffalo, NY

Lead EDW Architect / Tech Lead

Responsibilities:

  • Worked with EDW teams to accomplish various EDW initiatives and objectives: designed and implemented data models according to project requirements, and participated in data architecture strategy reviews, data provisioning, and data policy-making initiatives.
  • Contributed data architectural designs; developed and implemented strategies, policies and standards for data architecture in the EDW practice; ensured that database design, recovery and implementation were carried out effectively; and handled data cleaning by removing old, irrelevant data in consultation with the stakeholders.
  • Interacted with various business users to gather business requirements and translated them into technical specifications for solution design documents, technical design documents, HLDs and LLDs, proposing BI solutions to the business with data models, ER diagrams, mapping documents etc.
  • Extensively researched and tested EDW tools for data profiling; WID key generation and integration of WID key generation and data quality tools into the ETL pipeline; and sweep/match clustering to effect entity resolution across multiple instances of objects of interest for updates to the MDM.
  • Contributed to designing, developing, testing, deploying and supporting ETL applications for the SOLR data loads, with extensive use of Teradata for optimal data loads to the target data mart using load utilities, TPT and TDM, and recommended proper indexing and partitioning (see the table sketch after this list).
  • Created and modified ETL jobs, mappings, mapplets, sessions, worklets, workflows and XML source definitions for the data to be loaded from Subversion XML format into the target databases.
  • Optimized and tuned the existing ETL load processes to fine-tune the data loads, removing bottlenecks from Oracle, DB2 and XML source data into the Teradata data mart and at transformation levels, enabling quick retrievals from the SOLR search engine for the users
  • Designed and implemented data quality plans for the data loaded to the back end of the search engine, ensuring it adheres to standards and publishes correct product and plan information for customers and employees alike.
  • Designed and developed BI solutions in OBIEE with pages, reports and dashboards over data loaded into various relational databases by the ETL processes.
  • Built a repository and dashboards, defined security, and managed caching within OBIEE with physical table cache and event polling according to the frequency of data updates on the source systems.
  • Worked with metadata, defined facts and logical facts, and optimized performance of OBIEE solutions with appropriate aggregations, caching and additional logical tables in the BMM layer.
  • Posted the XML data to the SOLR portal once the target database load completed, in order to compare the data
  • Provided technical design and solution design documents for the implementations led and participated in.
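
The indexing and partitioning recommendations mentioned above typically translated into Teradata DDL along these lines; the database, table and column names below are purely illustrative:

    -- Hypothetical Teradata target table with a NUPI and a date-range PPI
    CREATE MULTISET TABLE mart_db.product_detail
    (
        product_id    INTEGER      NOT NULL,
        plan_cd       CHAR(6)      NOT NULL,
        load_dt       DATE         NOT NULL,
        product_desc  VARCHAR(200)
    )
    PRIMARY INDEX (product_id)    -- non-unique primary index chosen for even row distribution
    PARTITION BY RANGE_N (load_dt BETWEEN DATE '2010-01-01' AND DATE '2012-12-31'
                          EACH INTERVAL '1' MONTH);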

Environment: Informatica PowerCenter 8.6, IDQ, Teradata V2R7, Oracle 10g, MDM, Siebel, ERwin, OBIEE 10.1.3.4, Teradata SQL Assistant, Eclipse, Altova XMLSpy, Microsoft Visio, UNIX, PuTTY, AIX 5.3, SQL Server 2008, Excel, WinSCP (FTP), Windows XP, AutoSys, SQL Developer 1.0, TOAD, MS SharePoint, SOLR

Confidential, Charlotte, NC

Lead ETL Systems Engineer

Responsibilities:

  • Provided support for the organization's database architecture through database design, modeling and implementation, and developed plans, strategies and standards for data within the EDW setup.
  • Analyzed the end-to-end scope of work and designed architectural solutions accordingly, integrating all data into the data management platform to enable accessibility and ensuring the database is secured and kept up to date across database platforms and environments.
  • Contributed to designing, developing, testing, deploying and supporting the ETL processes to extract, transform and load data from OLTP to OLAP environments, working with all other teams (database management, development and data governance) and users to provide the best solutions while maintaining best-practice standards
  • Deployed, administered and ensured the change management process for the various ETL processes developed and supervised across Test (ST, FT) and Production environments.
  • Implemented Type II Slowly Changing Dimension methodology to keep track of historical data, and created reusable transformations and mapplets in the Designer using the Transformation Developer and Mapplet Designer according to the business requirements.
  • Deployed and scheduled scripts and jobs into production with StarTeam, SM and Control-M.
  • Developed several Informatica mappings, mapplets and workflows to load data from relational, XML and COBOL sources into the various data marts in the Confidential EDW.
  • Implemented Slowly Changing Dimensions (SCD) per business requirements to preserve history for dimension tables in the star schema using InfoSphere DataStage 8.0.1.
  • Created FastLoad, FastExport, MultiLoad, TPump and BTEQ scripts to load data from the Oracle database and flat files to data marts in Teradata.
  • Built a repository and dashboards, defined security, and managed caching within OBIEE with physical table cache and event polling according to the frequency of data updates on the source systems.
  • Designed and developed BI solutions in OBIEE with pages, reports and dashboards over data loaded into various relational databases by the ETL processes.
  • Used the Siebel admin tool to set up different user IDs, configure access authentication, and configure LDAP for folders and directories.
  • Configured data-level security in OBIEE to serve the appropriate scope of report data to the various levels of users.
  • Developed shared containers in QualityStage and DataStage jobs to increase reusability, which saved testing and design time and helped bring down total project time
  • Scheduled jobs in Information Analyzer for profiling data from different sources, which involved identifying the key fields for consumer and product details, table constraints, and primary/foreign key relations.
  • Created UNIX scripts for various purposes such as FTP, archiving files and creating parameter files.
  • Created Oracle stored procedures to delete duplicate records from warehouse tables (see the sketch after this list).
  • Worked with the Informatica servers to support and troubleshoot issues with various ETLs, using session logs and bad files to trace errors that occurred during the ETL process.
  • Responsible for troubleshooting, identifying and resolving data problems; worked with analysts to determine data requirements and identify data sources; provided estimates for task durations.
  • Adhered to SOX compliance and provided various documentation to support the ETL processes
  • Performed unit testing and integration testing of mappings and workflows, and participated in 24x7 on-call production support within the ETL team.
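
The duplicate-removal stored procedures mentioned above followed a pattern similar to this PL/SQL sketch; the table and column names are hypothetical:

    -- Illustrative PL/SQL de-duplication routine (keeps the first physical row per business key)
    CREATE OR REPLACE PROCEDURE purge_duplicate_rows AS
    BEGIN
        DELETE FROM wh_transaction t
        WHERE  t.ROWID NOT IN (SELECT MIN(x.ROWID)
                               FROM   wh_transaction x
                               GROUP BY x.acct_id, x.txn_dt, x.txn_amt);
        COMMIT;
    END purge_duplicate_rows;
    /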

Environment: Informatica PowerCenter/PowerMart 8.6/7.1.3, IBM InfoSphere Information Server, SQL Server 2008, IIS, Teradata V2R6/V2R5, Oracle 10g, Siebel, OBIEE 10.1.3.4, Teradata SQL Assistant 7.2, Control-M, SM, ClearCase, BEA WebLogic Server 9, Borland StarTeam, Microsoft Visio, UNIX, PuTTY, AIX 5.3, Excel, WinSCP (FTP), Windows XP, Perl scripting, SQL Developer 1.0, TOAD, MS SharePoint

Confidential

Sr. Informatica ETL Developer / Analyst

Responsibilities:

  • Developed complex mappings to implement Type 2 slowly changing dimensions for handling history requirements, using transformations such as Source Qualifier, Aggregator, Expression, Static Lookup, Dynamic Lookup, Filter, Router, Rank, Union, Normalizer, Sequence Generator, Update Strategy and Joiner.
  • Developed Informatica coding standards documents, naming conventions, and mapping documents for all the subject areas, with business logic from source to target to be referenced during ETL development.
  • Performed data analysis, fact and dimension modeling, and normalization.
  • Worked extensively on developing reusable components such as sessions and mapplets used across many projects for various data transformations.
  • Extensive experience developing and using Teradata load utilities such as BTEQ, TPump, MultiLoad, FastLoad and FastExport to load OLTP data through Informatica ETL processes into data marts in Teradata.
  • Worked on various ETL processes to load the TCM/Shipment data marts using fact and dimension table loads.
  • Used IDQ and ETL testing to confirm the data quality, quantity, accuracy and efficacy of the ETL transformations designed and developed.
  • Designed and tuned DB2 and Oracle SQL queries to optimize the load process in addition to designing the mappings.
  • Developed triggers for CDC (change data capture) to sync the load process into the ODS with an ETL process developed for real-time data migration.
  • Performed data cleansing with various Informatica transformation components.
  • Created and deployed workflows into production with Informatica Repository Manager.
  • Designed and developed BI solutions in OBIEE with pages, reports and dashboards over data loaded into various relational databases by the ETL processes.
  • Built a repository and dashboards, defined security, and managed caching within OBIEE with physical table cache and event polling according to the frequency of data updates on the source systems.
  • Used repository query tools, created iBots to schedule and automate reports, worked with Logical Table Sources (LTS) and multiple LTS in OBIEE, and maintained object-level security with the Siebel Web Catalog.
  • Used the Siebel admin tool to set up different user IDs, configure access authentication, and configure LDAP for folders and directories.
  • Configured data-level security in OBIEE to serve the appropriate scope of report data to the various levels of users.
  • Created various SQL scripts, XML and Teradata scripts for handling source-dependent tables prior to loading tables in the data marts for DML operations (see the housekeeping sketch after this list).
  • Configured XPath to source fixed-length files from the mainframe to UNIX and Teradata targets.
  • Collected statistics on Teradata tables on a regular basis to enhance load performance.
  • Designed logical primary keys to enable the ETL process from source to target tables.
  • Designed and developed various archive databases for very large databases at Union Pacific, helping the business analyze the data in crucial decision-making processes.
  • Created JCL jobs to kick off production jobs that source data from the mainframe to external databases used as sources for ETL processes.
  • Applied strong logical, analytical, business intelligence and communication skills to extract and analyze the business requirements for the projects developed.
  • Rich experience working with tools such as UC4, AutoSys, CA Workload Control Center and OBI dashboards to schedule and monitor the jobs and test the delivered data.
  • Strong familiarity with mainframe processes, doing XPath configuration for flat files loaded into various databases via scheduled ETL processes.
  • Designed TIBCO mappings and workflows and set up TIBCO repositories.
  • Led the offshore development team and was involved in the production support process to resolve various time-sensitive, mission-critical issues.
  • Served in a regular on-call role for round-the-clock production support and maintenance in the production environment.
  • Strong attention to detail, presentation, orientation and training skills to present and support the products to various client departments.
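
A minimal sketch of the pre-/post-load Teradata housekeeping described above, under assumed object names (stg_shipment_fact and shipment_fact are illustrative):

    -- Before the Informatica load: clear the source-dependent staging table
    DELETE FROM stage_db.stg_shipment_fact;

    -- After the load completes: refresh optimizer statistics on the loaded mart table
    COLLECT STATISTICS ON mart_db.shipment_fact COLUMN (ship_dt);
    COLLECT STATISTICS ON mart_db.shipment_fact COLUMN (shipment_id);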

Environment: Informatica PowerCenter/PowerMart/PowerExchange 8.6.1/8.1.1, IDQ, Hyperion System 9 MDM and HFM Build 9.2.0, Teradata V2R6, SQL Assistant, Oracle 10g/9i, TOAD 8.6, OBIEE, Business Objects v6.5, UNIX, UNIX shell scripting (Korn shell), PL/SQL, IBM Mainframe 2094 Series, JCL, SQL Server 2008, C#, AutoSys, UC4 5.0/6.0, ERwin 4.0, CA Unicenter Workload Control Center, Windows NT.

Confidential

ETL Developer / Analyst

Responsibilities:

  • Involved in design, development, prototyping, testing and documentation for ETL applications through a functional-level application development process, for data loads from the Oracle database to the Hyperion FDM system and Hyperion Metadata Management systems for the Enterprise Data Systems division.
  • Developed various mappings using Informatica components with transformation logic for the business requirements from the business owners within the Finance department.
  • Created and configured the ETL environments for the Dev/QA/Prod and Test instances, and used RCS version control for the change control process for Informatica workflows.
  • Prepared XML, UNIX shell scripts and tuned SQL scripts for DML operations against Oracle and DB2 databases, and created and scheduled jobs on a regular basis for ETL load processes into EDW data marts.
  • Implemented Informatica configuration and a well-tuned environment setup for the production and development environments.
  • Deployed Informatica mappings and workflows and executed them through PowerCenter and shell commands.
  • Involved in thorough data analysis, normalization, data modeling for facts and dimensions, and data mapping review, including sources, staging areas and the target database.
  • Developed and implemented jobs with the AutoSys and AppWorx schedulers.
  • Designed dashboard applications and reports with Business Objects.
  • Prepared transformation scripts and ANSI SQL and PL/SQL queries for the ETL process.
  • Prepared UNIX shell scripts within the mappings to enable the transformation logic for the business rules set by requirements and meet the ETL process goals.
  • Implemented data loads of over 3 million rows per load for Actual and Forecast data through various kinds of transformations applying the business rules, sending the target data to the Hyperion Financial Data Quality Management system.
  • Used TOAD to extract data from Oracle 9i and performed data scrubbing, restructuring and consolidation for loading and reporting purposes.

Environment: Informatica PowerCenter/PowerMart 7.1, Ab-Initio GDE, DB2, Teradata v2R5, Oracle 9i/8i, SQL Server 2005/2000, Business Objects 5.0, UNIX, Unix shell scripting (Korn Shell), PL/SQL, ANSI/SQL, Hyperion System 9 MDM and HFM Build 9.2.0, TOAD, Clarity 7.5.3, AutoSys, AppWorx, ERWIN 4.0, Windows NT

Confidential

Informatica Developer / Analyst

Responsibilities:

  • Involved in analysis, development, testing, documentation, delivery and production support for ETL applications through a functional-level application development process, loading data from flat files into the staging area using Informatica PowerCenter 7.1.2
  • Worked with the Informatica PowerCenter 7.1 tools - Source Analyzer, Mapping Designer, Workflow Manager, mapplets and reusable transformations.
  • Using the Designer, designed mappings that populated the data into the target star schema on an Oracle 10g instance.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy transformations.
  • Worked with pre- and post-sessions and extracted data from the transaction system into the staging area.
  • Used database objects such as views and partitioning to handle complex logic.
  • Tuned the mappings for optimum performance, dependencies and batch design.
  • Implemented and documented Change Control Requests (CCRs) for the Order & Pre-Order domains.
  • Scheduled and ran extraction and load processes and monitored sessions using Informatica Workflow Manager.
  • Made extensive use of AutoSys for scheduling and running ETL loads on a regular basis.
  • Scheduled the tasks to be run using the Workflow Manager.
  • Involved in data model and ETL mapping reviews, including sources, staging areas and the target database.
  • Performed Teradata database system setup for the billing applications; followed the change control process and implemented the security policy according to corporate standards for the development, test and production instances.
  • Created batch processes using FastLoad, BTEQ, UNIX shell and Teradata SQL to transfer, clean up and summarize data.
  • Designed ETL processes to extract data from SQL Server 2005 and SAS to Teradata, performing data scrubbing, restructuring and consolidation for web reporting.
  • Worked with Endur, a risk management software system, feeding it ETL data from Informatica and using scripts to source data from the Endur risk management platform.
  • Used TOAD and InfoMaker for data retrieval from the Teradata and Sybase databases and for data set creation
  • Used database query optimization and I/O tuning techniques for performance enhancements.
  • Produced documentation for requirements analysis, test plans and production support activities according to the Comprehensive Delivery Process, a standard at Confidential

Environment: Informatica Power Center 7.1.2, AbInitio - GDE/Co-Op, Teradata v2R5, Unix shell scripting (Korn Shell), Oracle PL/SQL, SQL Server 2005, ANSI/SQL, Business Objects, AutoSys, Sybase, IBM AIX 5.x, TOAD, Endur, Infomaker, Clarity 7.5.3/8.x, ERWIN, Windows NT

Confidential, Charlotte, NC

Oracle Database Administrator

Responsibilities:

  • Installation, configuration and maintenance of Oracle 10g/11g Real Application Clusters (RAC).
  • Implemented a Grid-based monitoring solution for large databases; installed Clusterware and configured public, private and VIP interfaces.
  • Worked extensively in various areas of Data Guard, i.e. installation, recovery, RTA (Real Time Apply), patching, tuning, switchover, switchback and failover.
  • Implemented and configured 11g Grid Control on RHEL.
  • Performed logical backups of critical application databases using EXPDP/IMPDP with a scheduled job (cron job).
  • Created the RMAN catalog database, registered the databases in the catalog database, and cloned production RAC instances to development and QA systems using RMAN duplication.
  • Worked on shell, Korn and Bash Scripting.
  • Experience in using 11g features such as Data Pump, Flash Back Recovery, ASM, AWR, and ADDM.
  • Defragmented data blocks of tablespaces for optimized performance and partitioned large tables to increase performance; evaluated Oracle 11gR2 GoldenGate for future use, i.e. to reduce the complexity involved in Oracle Streams.
  • Configured Oracle Streams on Oracle 11g, 10g databases for data replication and monitored them using Grid Control.
  • Automated the Data Replication process using Materialized Views for reporting databases.
  • Used Oracle Enterprise Manager (OEM) 11g Grid Control for monitoring multiple databases and notification of database alerts, and configured EM agents on multiple database servers.
  • Generated and automated Statspack/AWR reports from Oracle 11g/10g databases and analyzed the reports for Oracle wait events, time-consuming SQL queries, tablespace growth and database growth.
  • Involved in SQL query tuning and provided tuning recommendations for ERP jobs and time/CPU-consuming queries.
  • Used Explain Plan, Oracle hints and creation of new indexes, identifying the join methods (nested loops, hash join, sort-merge join) between row sources, to improve the performance of SQL statements (see the sketch after this list).
  • Addressed developer/tester requests to clone production databases for testing purposes using RMAN; successfully installed and maintained physical standby databases using Oracle Data Guard for Oracle 11.2.0.3 and Oracle 10.2.0.5 RAC databases.
  • Implemented switchovers on primary and standby databases as a part of planned maintenance activities.
  • Used Data Pump and the conventional export/import utilities of Oracle 11g/10g for re-organizing databases, schemas and tables to improve performance.
  • Used import/export utilities for cloning/migration of small sized databases and Datapump import/export to move data between 10g/11g environments.
  • Installed ASM environment and migrated databases from NON-ASM to ASM.
  • Used TOAD for database management.
  • Made optimal use of Oracle Support in resolving the SRs.
  • 24 X 7 Production & Development database support and administration.
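
As an illustration of the Explain Plan / hint-based tuning approach noted above, a simplified sketch with hypothetical object names:

    -- Add a composite index for the common access path, then verify the optimizer picks it up
    CREATE INDEX ix_orders_cust_dt ON orders (customer_id, order_dt);

    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(o ix_orders_cust_dt) */ o.order_id, o.order_total
    FROM   orders o
    WHERE  o.customer_id = :cust_id
    AND    o.order_dt   >= SYSDATE - 30;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);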

Environment: Oracle 11.1.0.7, 10.2.0.5, IBM AIX V6.1, RHEL 5.2, RAC, ASM, RMAN, AWR, ADDM, SQL* Plus, SQL*Loader, OEM, TOAD 10.0
