
Senior Data Architect Resume


Mount Laurel, NJ

SUMMARY

  • A PMI Certified Project Management Professional (PMP) with 23 years’ experience in Information Technology, experienced in Data Warehousing using Informatica Master Data Management, Informatica Metadata Manager, Informatica Power Center, IDQ, Ab Initio, DataStage, Oracle, Teradata and Business Objects in industry verticals such as Finance, Insurance and Healthcare.
  • Strong professional experience in designing, implementing, and managing short- and long-term projects in the Information Technology industry. Dynamic, energetic, and effective communicator with experience in managing and motivating multi-disciplined teams, comfortable communicating with senior management and systems professionals at all technical levels.
  • Experience with Agile development - organized and facilitated Agile and Scrum meetings, including Sprint Planning, Daily Scrums, Sprint Check-In, Sprint Review and Retrospective.
  • Extensive Global experience in leading large, complex IT onsite and offshore programs for Fortune companies. Consistent track record of on-time and on-quality delivery.
  • Responsible for all activities related to the development, implementation, and support of ETL processes for large-scale data warehouses.
  • Strong experience with Informatica MDM Hub design, configuration and implementation.
  • Strong experience in developing Informatica MDM User Exits, Trust rules, Match and Merge rules, IDD and Hierarchies.
  • Experience with Informatica Metadata Manager and Business Glossary.
  • Strong experience in Data Governance and in developing rules with Informatica Data Quality (IDQ) tools.
  • Extensive experience with ERWIN, Power Designer, data modeling & data analysis using Dimensional Data Modeling and Relational Data Modeling, Star Schema and Snowflake schema modeling, FACT and Dimension tables, and Physical and Logical Data Modeling.
  • Database experience using Oracle, Teradata, BTEQ, MultiLoad, FastLoad, SQL, PL/SQL, SQL Loader, Stored Procedures and Functions.
  • PMP-certified Project Manager with a proven track record of on-time project delivery and meeting or exceeding all target goals and objectives.
  • Extensively worked on Data Extraction, Transformation, and Loading with RDBMS and Flat files.
  • Experience in tuning mappings, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings and sessions.
  • Experience with Teradata’s Financial Services Logical Data Model (FSLDM).
  • Experience in providing the architectural solutions and creating the metadata model.
  • Experienced in all phases of the Systems Life Cycle, including project definition, analysis, design, coding, testing, implementation and support.
  • Expertise in designing Universes from scratch, migrating to different environments, designing Web Intelligence documents and maintaining them on Business Objects XI R2.
  • Experience with Big Data Technologies.
  • Expertise in UNIX and UNIX shell scripting.
  • Designed and deployed well-tuned Ab Initio graphs (generic and custom) for ODS and DSS instances in both Windows and UNIX environments.
  • Demonstrated experience in developing and implementing formal project management methodologies and establishing standards.
  • Experience in coordinating with offshore development teams.
  • Excellent problem solving, communication, team and client management skills.

TECHNICAL SKILLS

Tools: Siperian Informatica MDM 9.1/9.5/9.6/10/10.1, Informatica Power Center (8.x, 9.x), Informatica Designer, Workflow Manager, Workflow Monitor, Informatica Metadata Manager 9.6.1, IDD, IDQ, Informatica Developer 9.6.1, Informatica Analyst 9.6.1, Informatica Power Exchange, Data Explorer, Ab Initio, Greenplum, AddressDoctor, Trillium, Oracle MDM and DataStage 7.5/8.0.

Database Utilities: PL/SQL, Toad 8.0/12.1, Export/Import, SQL*Plus, SQL*Loader and SQL Developer

Scheduling Tools: Autosys, Maestro, Control-M and UC4

Databases: Oracle 8i/9i/10g/11g, MS SQL Server 2000/2005, MS Access, Sybase, Teradata V2R5/12/13, DB2, IMS DB and HBase

Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snowflake Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling, Erwin 3.5/4.0/7.1/9.0/9.5, Power Designer v16.5/16.6.

Big Data Technologies: Hadoop (HDFS & MapReduce), Pig, Hive, HBase, Sqoop, Impala, HiveQL, Kafka, YARN, Scala, R, Spark and Python.

Application Server: IBM WebSphere 4.2/5.1/6

Web Tools: JDBC, JSP, JavaScript, HTML, XML, SOAP, Web Services and VBScript.

Reporting Tools: Business Objects XI R2, BI 4.0, BO XI 3.1, Crystal Reports, Tableau, QlikView and Cognos

Defect Tracking Tools: Quality Center/ HP ALM

Operating Systems: Windows XP, MVS ESA, IBM-AIX 4.3.2 and Sun Solaris 2.6/7

Languages: PERL, SAS, MQ Series, Stored Procedures, Unix shell scripting, C#, Java

PROFESSIONAL EXPERIENCE

Confidential, Mount Laurel, NJ

Senior Data Architect

Responsibilities:

  • Interacted with business users to understand the requirements.
  • Developed the timeline and tracked the progress.
  • Developed project plan, monitor risks and execute risk response plans.
  • Negotiated with customers to prioritize the resolution of issues.
  • Performed problem analysis and issue resolution.
  • Created the system design specification (SDS) document for the entire program and reviewed it with management.
  • Involved in developing design documents and reviewing designs.
  • Created logical and physical data Models using Power Designer.
  • Used mapping editor in Power Designer to map between source and target columns.
  • Standardized the data using reference tables.
  • Reviewed the data profiling output and identified the data anomalies.
  • Developed stored procedures and prototype SQL and HiveQL.
  • Developed ETL to load the data into MDM Hub and to downstream systems.
  • Created trust, match and merge rules in the MDM Hub.
  • Developed landing, staging and base object models in MDM hub.
  • Created source-to-target mappings for each hop in the Hadoop ecosystem: source to data lake (EDPP), EDPP to pre-stage using Sqoop, pre-stage to stage layer, and stage to RRDW layer (see the HiveQL sketch after this list).
  • Configured the IDD application for different subject areas.
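
A minimal HiveQL sketch of the pre-stage-to-stage hop described above, under assumptions: the table, column, and HDFS path names (edpp_prestage.customer_stg, edpp_stage.customer, /data/edpp/prestage/customer) are illustrative placeholders rather than the project's actual objects, and the files are assumed to have been landed in HDFS by Sqoop.

    -- Hypothetical pre-stage table over files landed in HDFS by Sqoop.
    CREATE EXTERNAL TABLE IF NOT EXISTS edpp_prestage.customer_stg (
        customer_id    STRING,
        customer_name  STRING,
        address_line1  STRING,
        city           STRING,
        state_cd       STRING,
        load_dt        STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
    STORED AS TEXTFILE
    LOCATION '/data/edpp/prestage/customer';

    -- Promote lightly cleansed rows from pre-stage to the stage layer.
    INSERT OVERWRITE TABLE edpp_stage.customer
    SELECT  customer_id,
            UPPER(TRIM(customer_name)),
            TRIM(address_line1),
            TRIM(city),
            UPPER(TRIM(state_cd)),
            load_dt
    FROM    edpp_prestage.customer_stg
    WHERE   customer_id IS NOT NULL;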

Confidential, Pleasanton, CA

Senior Data Architect

Responsibilities:

  • Involved in MDM v10.0 installation activities using the JBoss application server and the Informatica platform.
  • Interacted with business users to understand the business need and prepared the requirements document.
  • Created the design document to master the data and build hierarchies.
  • Created Physical Data Objects and Logical Data Objects in Informatica developer.
  • Developed ETL code using Informatica Developer to bring data from different source systems and load into MDM staging tables.
  • Created the batch group jobs to load the data from MDM staging tables to Base object tables.
  • Configured match and merge rules to create the BVT.
  • Developed cleanse functions to standardize the data before loading into staging tables (a SQL sketch of this standardization follows this list).
  • Created the Entity and Relationship data model to build the hierarchies.
  • Developed the ETL code using Informatica Developer to load the Entity and Relationship tables from the excel source.
  • Built the MDM hierarchy packages and profiles to determine how the data should be displayed in MDM Hierarchy Manager and IDD.
  • Built IDD application and configured the subject areas to govern the hierarchical data.
  • Built basic and advanced queries to view and modify the hierarchical data through IDD.
  • Built IDD user exits to extend the MDM functionality.
  • Configured ActiveVOS to manage tasks, requests and reports.
  • Performed the data profiling using IDQ tool to identify the data anomalies.
  • Analyzed the profile results and scorecards and created the rules.
  • Performed data validation and implemented the DQ rules using mapplets in IDQ.
  • Created the dashboard with rules assigned to the data quality dimensions.
  • Developed custom metadata objects and added business glossary in Informatica Metadata Manager.
  • Created conceptual, logical and physical data models.
  • Copied data to and from the cluster and native OS; ingested data into HDFS using Sqoop.
  • Prepared the project plan and delivered on time, within scope and budget by monitoring and tracking progress.
  • Performed risk management to minimize project risks.
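
A minimal SQL sketch of the kind of standardization applied before the staging load, under assumptions: the landing and staging table names (lnd_customer, stg_customer) and their columns are hypothetical, and the production cleansing was implemented as Informatica MDM cleanse functions rather than hand-written SQL.

    -- Hypothetical standardization from a landing table into MDM staging.
    INSERT INTO stg_customer (src_rowid, full_name, phone, country_cd)
    SELECT  src_rowid,
            UPPER(TRIM(full_name)),                         -- consistent casing
            REGEXP_REPLACE(phone, '[^0-9]', ''),            -- keep digits only
            COALESCE(NULLIF(TRIM(country_cd), ''), 'US')    -- default blank codes
    FROM    lnd_customer
    WHERE   src_rowid IS NOT NULL;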

Confidential, Pleasanton, CA

Senior Informatica Architect

Responsibilities:

  • Gathered business requirements to migrate Rochade to Informatica Metadata Manager 9.6.1 for 8 regions.
  • Involved in design discussion and prepared the technical design document.
  • Recommended the architectural solutions and obtained approval from the Management.
  • Created source-to-target mappings and rules for the Teradata source to the custom model.
  • Developed ETL mappings to create CSV files of metadata from Teradata resources (see the SQL sketch after this list).
  • Developed the custom model in Metadata Manager.
  • Created a custom XConnect and linked custom models and out-of-the-box models.
  • Developed the custom resources to load the metadata in Informatica MM warehouse.
  • Loaded Rochade and Clarity metadata into the MM warehouse using out-of-the-box models.
  • Loaded the business glossary in Metadata Manager.
  • Created the Informatica MDM 10.0 installation document.
  • Involved in customer and delivery management.
  • Motivated the team to work together and ensured high-quality deliverables.
  • Responsible for the management of the technical project from initiation through implementation, including planning, analysis, design, development and implementation.
  • Provided overall direction for establishing project requirements, priorities and deadlines; integrated project plans into program plans.
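
A minimal Teradata SQL sketch of the kind of dictionary query used to extract table and column metadata for the custom Metadata Manager model, under assumptions: the database name (FIN_DW) is a placeholder, and the actual extract was produced by ETL mappings writing CSV files.

    -- Pull table/column metadata from the Teradata data dictionary (DBC views)
    -- in the layout expected by the custom Metadata Manager resource.
    SELECT  TRIM(c.DatabaseName)  AS database_name,
            TRIM(c.TableName)     AS table_name,
            TRIM(c.ColumnName)    AS column_name,
            c.ColumnType          AS data_type,
            c.ColumnLength        AS column_length,
            c.Nullable            AS nullable
    FROM    DBC.Columns c
    WHERE   c.DatabaseName = 'FIN_DW'
    ORDER BY c.TableName, c.ColumnId;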

Confidential, Sunnyvale, CA

Senior MDM and Data Integration Architect

Responsibilities:

  • Interacted with the Sales Operations team to gather business requirements and translate them into technical requirements.
  • Created the project plan and schedule and prioritized the tasks.
  • Developed the logical and physical Party data model.
  • Document and maintain technical requirements, including data flows, data structures, and data definitions.
  • Manage technical team on solution implementation.
  • Manage technical operations of Master Data Management and Aggregate Spend systems.
  • Configured Siperian Informatica MDM; set up trust rules and match and merge rules.
  • Loaded data from the various business applications into landing, staging and base object tables through the MDM Hub console.
  • Identified duplicate data for data steward review (see the SQL sketch after this list).
  • Used Merge Manager and Data Manager to merge and unmerge HCP and HCO data.
  • Developed customized user exits and performed address cleansing using IDQ.
  • Involved in MDM operations and enhanced Informatica mappings for existing functionality.
  • Performed code migration using MDM metadata manager.
  • Supported data issues with the Veeva Salesforce application.
  • Created profiles, scorecards and data quality rules.
  • Used AddressDoctor to standardize the addresses.
  • Created rich dashboards using Tableau and prepared user stories to create compelling dashboards that deliver actionable insights.
  • Define operational procedures and metrics to ensure effective and efficient operations.
  • Developed best practices and guidelines for MDM operations.
  • Performed data analysis, data fixes and data mastering per ad-hoc requests.
  • Defined areas for improvement and managed implementation of improvements.
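
A minimal SQL sketch of how duplicate candidates could be surfaced for steward review on a simple exact-match key, under assumptions: the staging table (stg_hcp) and its columns are hypothetical, and the production matching was driven by Informatica MDM match and merge rules rather than SQL.

    -- Group hypothetical HCP records on a simple match key; groups with more
    -- than one record become duplicate candidates for data steward review.
    SELECT  UPPER(TRIM(last_name)) || '|' ||
            UPPER(TRIM(first_name)) || '|' ||
            TRIM(npi_number)               AS match_key,
            COUNT(*)                       AS candidate_count
    FROM    stg_hcp
    GROUP BY UPPER(TRIM(last_name)) || '|' ||
             UPPER(TRIM(first_name)) || '|' ||
             TRIM(npi_number)
    HAVING  COUNT(*) > 1
    ORDER BY candidate_count DESC;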

Confidential, Boston, MA

Senior Data Architect

Responsibilities:

  • Built Informatica mappings to use AddressDoctor for address cleansing.
  • Developed Informatica mappings to get data from IDQ database for RSA and SAP data.
  • Performed impact analysis using Metadata manager.
  • Enhanced the existing Informatica mappings to receive data from 11i for the defined set of attributes into the Informatica MDM Hub.
  • Performed data conversion from the Informatica MDM Hub to iMAP.
  • Created the Data Conversion Design document and obtained sign-off from the client.
  • Performed code migration from Dev to Test and Performance environments, including check-in, check-out and applying labels.
  • Adhered to the Informatica MDM architecture to extract data from 11i into the LOAD inbound staging environment for the landing load to the Customer MDM Hub, and to extract data from the Customer MDM Hub into the outbound staging environment; integrated both batch and real-time data (an incremental-extract SQL sketch follows this list).
  • Created the enterprise conceptual, logical and physical data model for MDM.
  • Defined guidelines and best practices for building the data model.
  • Developed cleanse functions; set up trust rules, match and merge rules, and hierarchies in MDM.
  • Merged and unmerged data using Merge Manager and Data Manager.
  • Performed data quality checks using IDQ.
  • Supported functional and system testing and ensured defects were fixed on time.
  • Coordinated with business, onsite and offshore teams.
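
A minimal SQL sketch of an incremental (delta) extract into inbound staging, under assumptions: the source and landing table names (src_customer_11i, lnd_customer) and the control table (etl_control) are hypothetical, and the actual extracts were implemented as Informatica mappings rather than SQL scripts.

    -- Hypothetical delta extract: pull only rows changed since the last run,
    -- using a watermark kept in a control table.
    INSERT INTO lnd_customer (customer_id, customer_name, site_id, last_update_date)
    SELECT  s.customer_id,
            s.customer_name,
            s.site_id,
            s.last_update_date
    FROM    src_customer_11i s
    WHERE   s.last_update_date > (SELECT c.extract_high_watermark
                                  FROM   etl_control c
                                  WHERE  c.mapping_name = 'CUSTOMER_11I_TO_LANDING');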

Confidential, Beaverton, OR

Sr. Data Architect

Responsibilities:

  • Developed high level design and technical specifications for Informatica mappings as per the business requirements.
  • Developed high-level designs for the source-to-target mappings as per the business requirements.
  • Created the Informatica data quality architecture framework and obtained sign-off.
  • Developed conceptual, logical and physical data model for MDM.
  • Developed Informatica ETL mappings to load data from external sources into landing, staging and base object tables in the MDM Hub.
  • Developed MDM cleanse functions; set up trust rules, validation rules, and match and merge rules.
  • Configured Business Data Director; executed jobs using batch groups and the batch viewer.
  • Analyzed the root cause of data issues in the Data Mart reports and fixed the ETL mappings.
  • Created Web Intelligence reports.
  • Performed data profiling using Metadata Manager.
  • Developed conceptual, logical and physical data models.
  • Developed best practices and guidelines.
  • Defined the approach for error handling and reconciliation reports (a reconciliation SQL sketch follows this list).
  • Supported UAT, deployment and production stabilization.
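
A minimal SQL sketch of a source-to-target row-count reconciliation, under assumptions: the staging and base object table names (stg_customer, c_b_customer) and the load_dt column are hypothetical, and the production reconciliation reports carried more detail than row counts.

    -- Compare row counts between staging and the hub base object by load date;
    -- any non-zero difference is flagged for error-handling follow-up.
    SELECT  s.load_dt,
            s.stg_rows,
            b.base_rows,
            s.stg_rows - b.base_rows AS row_diff
    FROM   (SELECT load_dt, COUNT(*) AS stg_rows
            FROM   stg_customer
            GROUP BY load_dt) s
    JOIN   (SELECT load_dt, COUNT(*) AS base_rows
            FROM   c_b_customer
            GROUP BY load_dt) b
      ON    s.load_dt = b.load_dt
    WHERE   s.stg_rows <> b.base_rows;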

Confidential, Columbus, OH

Senior Data warehouse Consultant/Architect

Responsibilities:

  • Performed requirement gathering, coding, code reviews, testing and implementation for Annuity, Life Insurance and Retirement plans.
  • Used FSLDM for data modeling and developed conceptual, logical and physical data models.
  • Designed ETL jobs in DataStage for the EDW, which provides data for Business Objects reports.
  • Customized the Teradata’s FSLDM based on the requirements.
  • Extensively worked with Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Mapplet Designer.
  • Optimized various Mappings, Sessions, Sources and Target Databases.
  • Implemented Slowly Changing Dimensions (Type 2: versions) to update the dimensional schema (see the SQL sketch after this list).
  • Extensively involved in extracting data from SQL Server, Oracle and flat files and in designing ETL processes using Informatica and DataStage.
  • Implemented a new project by changing the current DW and prepared the detailed design document.
  • Created standards and guidelines for data modeling and ETL processes.
  • Configured Customer and Products in Oracle MDM.
  • Interacted with end users in a support role to fulfill requests or resolve issues.
  • Created and reviewed architectural decisions and architecture solutions, and re-engineered architectures to create solution blueprints that meet project requirements.
  • Evaluated and reviewed design frameworks and methodologies and approved designs to achieve functional and non-functional requirements and conformance to the architecture.
  • Designed and recommended changes to data warehouse schemas such as star and snowflake designs.
  • Performed data conversion from Oracle to Teradata.
  • Performed capacity planning, database recovery and archival.
  • Participated in POCs; architected and validated complex technical solutions.
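
A minimal SQL sketch of a version-based Type 2 slowly changing dimension update, under assumptions: the dimension and staging tables (dim_customer, stg_customer), their columns, and the version_nbr/begin_dt/end_dt housekeeping columns are hypothetical, and the production logic was implemented in Informatica/DataStage mappings with surrogate keys generated by the ETL tool.

    -- 1. Close the open version of any customer whose tracked attributes changed.
    UPDATE dim_customer
    SET    end_dt = CURRENT_DATE - 1
    WHERE  end_dt IS NULL
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = dim_customer.customer_id
                   AND   (s.address_line1 <> dim_customer.address_line1
                          OR s.status     <> dim_customer.status));

    -- 2. Insert a new version for changed and brand-new customers.
    INSERT INTO dim_customer (customer_id, address_line1, status,
                              version_nbr, begin_dt, end_dt)
    SELECT  s.customer_id,
            s.address_line1,
            s.status,
            COALESCE((SELECT MAX(d.version_nbr)
                      FROM   dim_customer d
                      WHERE  d.customer_id = s.customer_id), 0) + 1,
            CURRENT_DATE,
            NULL
    FROM    stg_customer s
    WHERE   NOT EXISTS (SELECT 1
                        FROM   dim_customer d
                        WHERE  d.customer_id = s.customer_id
                        AND    d.end_dt IS NULL);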

Confidential, Charlotte, NC

Development Manager

Responsibilities:

  • Worked with Business Analysts and business users to gather requirements and translate them into technical specifications.
  • Planned, analyzed, designed, constructed, tested and implemented new data feeds.
  • Prepared metrics reports for each member of the team to track effort and schedule variance.
  • Planned, executed, and performed estimation of projects that use Informatica.
  • Motivated the team to work together and ensured high-quality deliverables.
  • Responsible for the management of the technical project from initiation through implementation, including planning, analysis, design, development and implementation.
  • Provided overall direction for establishing project requirements, priorities and deadlines; integrated project plans into program plans.
  • Produced the high-level design and documentation of the Teradata ETL architecture.
  • Designed and managed physical data model using Erwin for Data Warehouse and Data mart based on user requirements.
  • Worked on the complete SDLC for extraction, transformation and loading of data using Informatica and Teradata.
  • Involved in ETL design for the new requirements and suggested feasible solutions to integrate the core and new systems.
  • Developed MultiLoad (MLOAD) and FastLoad (FLOAD) scripts for loading data into Teradata.
  • Performed performance tuning of the ETL jobs and database (a statistics-collection SQL sketch follows this list).
  • Analyzed the existing universe and database to enhance the existing universe and created new universes.
  • Utilized Designer to design, create, maintain and distribute universes.
  • Performed data quality checks and interacted with the business analysts.
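
A minimal Teradata SQL sketch of the statistics collection and plan inspection used during tuning, under assumptions: the database, table, and column names (edw.policy_fact, policy_id, effective_dt, premium_amt) are placeholders, not the project's actual objects.

    -- Collect statistics on join/filter columns of a hypothetical fact table so
    -- the optimizer can choose better plans, then inspect a representative plan.
    COLLECT STATISTICS ON edw.policy_fact COLUMN (policy_id);
    COLLECT STATISTICS ON edw.policy_fact COLUMN (effective_dt);

    EXPLAIN
    SELECT  p.policy_id,
            SUM(p.premium_amt)
    FROM    edw.policy_fact p
    WHERE   p.effective_dt BETWEEN DATE '2011-01-01' AND DATE '2011-12-31'
    GROUP BY p.policy_id;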

Confidential, Columbus, GA

Sr. Data Modeler/Tech Lead

Responsibilities:

  • Tasks included conducting requirements analysis, designing and developing a data model (star schema), creating an Oracle 9i database, coding ETL procedures (PL/SQL and Informatica), designing Cognos Impromptu reports, and automating production jobs (a PL/SQL sketch follows this list).
  • Created conceptual, logical and physical data models in 3rd normal form and star schema using Erwin for staging database and for OLAP database.
  • Created complex mappings for moving data from different sources into the Oracle database and used Informatica as the ETL tool for continually moving data from sources into the staging area.
  • Implemented Slowly Changing Dimensions (Type 2: flag) to update the dimensional schema.
  • Monitored workflows and collected performance data to maximize the session performance.
  • Responsible for creating test cases to make sure data originating from the source makes it into the target in the right format.
  • Responsible for stress testing ETL routines to make sure that they don't break under heavy loads.
  • Involved in business analysis and technical design sessions with business and technical staff to develop the requirements document and ETL specifications.
  • Reviewed existing data availability and quality and prepared detailed documentation.
  • Developed various mappings using Expression, Aggregator, Joiner, Java, Lookup, Filter, Router and Update Strategy transformations, slowly changing dimensions, reusable components, mapplets, sessions and workflows as per the design.
  • Performed data quality checks and interacted with the business analysts.
  • Designed processes that transform and load source data into fact and dimension tables according to best practices.
  • Implemented database and ETL programs according to design and coding standards; conducted peer design and code reviews.
  • Maintained the repositories for different environments such as DEV, QA and PROD.
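
A minimal PL/SQL sketch of the kind of ETL procedure referenced above, under assumptions: the staging and dimension tables (stg_policy, dim_policy) and their columns are hypothetical, and the production procedures also covered error logging and restartability.

    -- Hypothetical procedure that loads new policy rows from staging into a
    -- dimension table; names and columns are illustrative only.
    CREATE OR REPLACE PROCEDURE load_dim_policy AS
    BEGIN
        INSERT INTO dim_policy (policy_id, policy_type, status, load_dt)
        SELECT  s.policy_id,
                s.policy_type,
                s.status,
                SYSDATE
        FROM    stg_policy s
        WHERE   NOT EXISTS (SELECT 1
                            FROM   dim_policy d
                            WHERE  d.policy_id = s.policy_id);

        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            RAISE;  -- re-raise so the scheduling tool marks the job as failed
    END load_dim_policy;
    /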
