Data Architect Resume
Chicago, IL
SUMMARY:
- Confidential is an expert-level Data Architect/ETL Architect/Data Modeler with 14 years of experience in systems analysis, ETL design, and the implementation of numerous ETL projects.
- General Skills/Experience
- Requirements gathering and business rules analysis
- Logical/physical database modeling
- ETL Strategy & ETL Architecture Design.
- Data Modeling and Data Architecture
- Data Profiling using SQL and tools.
- Master Data Management and Reference Data Management
- Extraction/Transformation/Loading (ETL) design and implementation
- Error and Exception management
- ETL & Informatica mapping Performance Tuning.
- Design of ETL mapping & Tech Specifications
- Design of ETL Auditing and Balancing mappings
- Design ETL Architecture for ETL Systems
- Unit testing, Integration testing of Data to assure Data Quality
- Product Evaluations
- Work with vendors to establish the required infrastructure and frameworks for ETL & Informatica Installation
- Team leadership/mentoring/training
- Coordination of offshore development efforts
- Liaison between business users and development staff
- Shell scripting in Unix and ETL process Automation
- Iterative development processes
- “Agile” development processes
TECHNICAL SKILLS:
Business Problem Analysis: Data Warehouse Architecture and Design, ETL Development
Leading and Mentoring: ETL Developers, ETL Development
TECHNIQUES: OOA/D & Programming, Entity-Relationship (ER) Modeling, Agile techniques
ANALYSIS/DESIGN TOOLS: ER/Win
DATABASE PLATFORMS: Oracle, DB2, MS SQL Server, Hive, Hadoop
Languages/Tools: SQL/PL-SQL, Informatica
Reporting/Scripting: Business Objects, Unix Shell, PowerShell
PROFESSIONAL EXPERIENCE:
Confidential, Chicago, IL
Data Architect
Responsibilities:
- Understand the current PARS data warehouse data structures and relationships.
- Understand the source systems sending data to the DWH
- Understand the SSIS ETL process that loads data into the DWH
- Understand reporting requirements not met by the current DWH and ensure they are satisfied in the new DWH
- Understand reports being generated from the DWH.
- Explore new design options for the new PARS data warehouse and create presentations explaining them
- Run an ETL tool evaluation to determine whether a new tool is needed to replace SSIS.
- Analyze new source system risk data being provided by FACTSET.
- Analyze existing risk ETL SSIS packages to understand and document the logic.
- Design new Risk Data Model based on the new source system and data integration needs.
- Work with business representatives to gather reporting requirements as input to the data model design.
- Design ETL logic for new SSIS packages
- Design and review PowerShell scripts for flat file processing
- Performance tuning of slow running SSIS packages and PowerShell Scripts.
- Performance tuning of SQL queries.
- Design and Develop SSIS packages for Data Quality (DQ ETL) checks to ensure that all data is loaded without any loss of information.
- Data integration analysis for the data warehouse, covering integration of data between TIAA and Nuveen
- Analyze existing Tableau risk reports to assess the impact of the new source on existing reports and suggest ways to mitigate it.
- Attend data model review sessions, present the data model to the team, answer questions, and obtain approval.
- Create source-to-target mapping documents for ETL code based on the data model.
- Design SCD Type 1 and Type 2 tables.
- Generate DDL from Erwin and deploy in DEV and QA.
- Design and deploy indexes for performance and data integrity/quality
- Check in SSIS packages and DDL in TFS for deployment to production
- Analyze data quality issues generated by DQ ETL and resolve them.
- Documentation of new Data Model.
- Verify DDL deployment using system views such as INFORMATION_SCHEMA.
- Review ETL code created by developers and suggest changes and approve them
- Explore use of Azure Data Lake for storing data
- Explore use of Hive on Azure Data Lake to store raw (unprocessed and uncleaned) data
- Conduct a proof of concept to assess whether Hadoop and Hive are a good fit for the organization and how they fit the overall data strategy
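The SCD Type 2 design mentioned above follows a standard pattern: when a tracked attribute changes, the current row is closed out and a new versioned row is inserted. A minimal sketch, shown with SQLite for illustration; the table and column names are hypothetical, not the actual PARS warehouse schema:

```python
import sqlite3

# Hypothetical SCD Type 2 dimension (illustrative names, not the real schema).
def apply_scd2(conn, cust_id, address, load_date):
    cur = conn.execute(
        "SELECT address FROM customer_dim "
        "WHERE cust_id = ? AND is_current = 1", (cust_id,))
    row = cur.fetchone()
    if row is None:
        conn.execute(
            "INSERT INTO customer_dim (cust_id, address, eff_from, eff_to, is_current) "
            "VALUES (?, ?, ?, '9999-12-31', 1)", (cust_id, address, load_date))
    elif row[0] != address:
        # Close out the current version, then insert the new one.
        conn.execute(
            "UPDATE customer_dim SET eff_to = ?, is_current = 0 "
            "WHERE cust_id = ? AND is_current = 1", (load_date, cust_id))
        conn.execute(
            "INSERT INTO customer_dim (cust_id, address, eff_from, eff_to, is_current) "
            "VALUES (?, ?, ?, '9999-12-31', 1)", (cust_id, address, load_date))
    # Unchanged rows are left alone; a Type 1 column would instead be
    # updated in place with no new version row.

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE customer_dim (
    cust_id INTEGER, address TEXT,
    eff_from TEXT, eff_to TEXT, is_current INTEGER)""")
apply_scd2(conn, 1, "100 Main St", "2016-01-01")
apply_scd2(conn, 1, "200 Oak Ave", "2016-06-01")  # change -> new version row
rows = conn.execute(
    "SELECT address, is_current FROM customer_dim ORDER BY eff_from").fetchall()
```

In production this logic would live in an SSIS package or a MERGE statement rather than row-at-a-time Python, but the versioning rules are the same.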
Tools and Technologies: SQLSERVER2012, POWERSHELL, ERWIN, SSIS, SSRS, Hadoop, Hive, HDInsight
Confidential, Libertyville, IL
Data Architect
Responsibilities:
- Establish Data Governance rules for the new Data warehouse
- Establish Data modeling and Data Architecture standards
- Design Data Quality strategy to improve reliability of the data for end users.
- Design new Audit Balance and Control for Data Quality Management.
- Design reporting infrastructure for Business Objects.
- Attend Code review to review code created by Developers.
- Based on the source system, design updates to the existing data models to house data from FIS, and create new data models.
- Analyze report mockups before designing reporting tables to ensure that reporting needs are met
- Impact analysis of existing BO reports to ensure that they do not break with table re-design
- Analyze universe design to understand impact of change of data model on existing reports.
- Build BO WebI reports to report metadata information needed by the business users and technical teams.
- Understand new source system FIS
- Work with Subject Matter Experts to clarify open and scenario questions.
- Design change to keys/granularity of tables to accommodate new data.
- Design new tables and relationships to accommodate new business process in Erwin.
- Design Master Data Management data model for Informatica MDM for the party subject area.
- Add/Drop columns to the data model to achieve reporting requirement.
- Analyze data files received from Fiserv and Design data model for them.
- Get Erwin Data model reviewed and approved by the Modeling lead.
- Attend Design meeting to discuss open issues.
- Work with ETL and Modeling team members to ensure standardization
- Provide data stewardship for data issues and work with the business to resolve them.
- Profile data to better understand it for data modeling and data quality analysis.
- Design ETL specification documents and transformation rules for use by ETL developers.
- Code review of Informatica code and provide suggested updates.
- Write SQL scripts to de-duplicate data and reload it into tables after transformation.
- Tune SQL queries that will be embedded into applications and Business Objects
- Deploy DDL in DEV and QA
- Check in DDL in Visual Studio and version and label them as required.
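The de-duplication scripts mentioned above typically keep one "winner" row per business key. A minimal sketch using SQLite's rowid; in SQL Server or Oracle the same idea is usually written with ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...). All table and column names here are illustrative:

```python
import sqlite3

# Hypothetical staging table with a duplicated business key.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_txn (acct_id INTEGER, amount REAL, load_ts TEXT)")
conn.executemany(
    "INSERT INTO stg_txn VALUES (?, ?, ?)",
    [(1, 10.0, "2016-01-01"), (1, 12.0, "2016-01-02"),  # duplicate acct_id 1
     (2, 99.0, "2016-01-01")])

# Keep only the row with the latest load_ts per acct_id; delete the rest.
conn.execute("""
    DELETE FROM stg_txn
    WHERE rowid NOT IN (
        SELECT rowid FROM stg_txn s
        WHERE s.load_ts = (SELECT MAX(load_ts) FROM stg_txn t
                           WHERE t.acct_id = s.acct_id))""")
remaining = conn.execute(
    "SELECT acct_id, amount FROM stg_txn ORDER BY acct_id").fetchall()
```

The "latest row wins" rule is an assumption; the actual survivorship rule would come from the business requirements.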
Tools and Technologies: Informatica 9.x, Oracle 11g, UNIX, ERWIN, SQLSERVER2008 and Business Objects.
Confidential
Data Architect
Responsibilities:
- Design of the data movement from source system to Hadoop and from there to the Data warehouse to enable reporting and analytics capabilities using Big Data platform
- Understood business requirements from the BSA and Business representative.
- Profiled source data to better understand it and create the data model housing the presentation layer.
- Created Hive internal or external tables defined with appropriate static and dynamic partitions.
- Sourced data from Hive and loaded it into the Oracle DWH.
- Work with Source System SME to understand the source system tables
- Design the presentation data marts for reporting on user activity
- Extracted the data from Hive to Oracle for reporting.
Tools and Technologies: Cloudera Hadoop (HDFS/MapReduce), HIVE, SQOOP, Oracle, Linux, Oozie, AWS.
Confidential, Northbrook, IL
Data Architect
Responsibilities:
- Work with Business Analysts to understand the reporting requirement.
- Analyze existing reports and compare them to new requirements and understand change in functionality.
- Analyze source data to better understand the requirements
- Analyze existing data models to understand data relationship
- Modify existing data models in Erwin to meet reporting requirement
- Generate DDL and make it available to the DBA for execution
- Attend DBCR meetings to understand developers' database requests and, where relevant, work with the DBA to implement them
- Work with Senior data Architect to understand tasks to be completed
- Get modified data model reviewed by Enterprise architecture.
- Designed ETL mappings for performance and re-usability
- Performance tuned Informatica mappings.
- Conducted code reviews
- Reviewed data loaded in QA region during UAT process and provided input.
- Designed ETL balancing and control total system to ensure data quality.
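A balancing and control-total system like the one described above typically compares row counts and amount checksums between source and target after each load. A minimal sketch, shown with SQLite and hypothetical table names:

```python
import sqlite3

# Hypothetical source and target tables for a balancing check.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_claims (claim_id INTEGER, paid_amt REAL)")
conn.execute("CREATE TABLE tgt_claims (claim_id INTEGER, paid_amt REAL)")
conn.executemany("INSERT INTO src_claims VALUES (?, ?)",
                 [(1, 100.0), (2, 250.5)])
conn.execute("INSERT INTO tgt_claims SELECT * FROM src_claims")  # the "load"

def control_totals(conn, table):
    # Row count plus a dollar-amount checksum for the table.
    cnt, total = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(paid_amt), 0) FROM {table}").fetchone()
    return cnt, round(total, 2)

src = control_totals(conn, "src_claims")
tgt = control_totals(conn, "tgt_claims")
balanced = (src == tgt)  # a mismatch would fail the batch / raise an alert
```

In a real ETL run these totals would be written to an audit table with the batch ID so discrepancies can be traced per load.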
Confidential, MI
Data Architect
Responsibilities:
- Design the Fusion Data warehouse to be able to accept integration data.
- Enhance the classic Data warehouse
- Design the reporting Data Mart to help achieve reporting goals.
- Analyze data in the ODS/DWH to determine the feasibility of implementing project requirements
- Discussion with business to determine the reporting expectations
- Model the reporting data Mart fact tables and Dimension tables.
- Create Source to Target mapping Document for the ETL team to develop mappings.
- Build ETL mappings on a case-by-case basis for metadata management and ETL execution run-time reporting
- Generate DDL for implementation by the DBA.
- Design views to enable reporting
- Develop Naming standards for data warehouse and Data Mart tables.
- Conduct Model review session to get the data model validated
- Conduct “show and tell” sessions explaining the capabilities of the data model and the reporting it enables.
- Update Metadata document to reflect the new changes to the data warehouse and Data Mart
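The fact and dimension modeling described above follows the standard star-schema pattern: a fact table holding measures, keyed to dimension surrogate keys. A minimal sketch with SQLite; all table, column, and data values are illustrative only:

```python
import sqlite3

# Hypothetical star schema for a reporting data mart.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20160131
        cal_date TEXT, cal_month TEXT, cal_year INTEGER);
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        product_code TEXT, product_name TEXT);
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        qty INTEGER, sales_amt REAL);
""")
conn.execute("INSERT INTO dim_date VALUES (20160131, '2016-01-31', '2016-01', 2016)")
conn.execute("INSERT INTO dim_product VALUES (1, 'P100', 'Widget')")
conn.execute("INSERT INTO fact_sales VALUES (20160131, 1, 3, 29.97)")

# A typical mart query: facts joined to dimensions, aggregated for reporting.
report = conn.execute("""
    SELECT d.cal_month, p.product_name, SUM(f.sales_amt)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.cal_month, p.product_name""").fetchall()
```

Reporting views (as in the bullet above) are usually defined over exactly this kind of join so report developers never touch the base tables directly.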
Tools and Technologies: Pentaho ETL, Oracle 11g, PL-SQL, UNIX, ERWIN, Teradata
Confidential
Data Architect
Responsibilities:
- Attend Business requirement meetings with the BSA.
- Profile source data to better understand the data
- Provide business functionality context to developers for their work.
- Design Slowly Changing Dimensions for maintaining history of changes.
- Design 3NF data models for data integration.
- Update and maintain the metadata document, which is then uploaded to the enterprise metadata tool Rochade.
- Create ETL source to target mapping document for the development team
- Generate DDL from Erwin and make it available to the DBA for deployment in DEV.
- Send user communication to relevant parties 2 weeks before deployment of project in PROD.
- Provide support to report developers by answering questions that they have.
- Conduct model walkthrough with the Architecture team for model approval.
- Review data models created by other data modelers
Tools and Technologies: Ab Initio, Oracle 11g, PL-SQL, UNIX, ERWIN, Teradata
Confidential, Michigan, MI
Data Modeler/Business Analyst
Responsibilities:
- Gathered requirements from the Source System business analyst.
- Developed Source to Target mapping documents.
- Designed Target Data model and intermediate staging tables in ERWIN
- Provide DBA with DDL for implementation in Development.
- Obtain model approval from Architecture lead.
- Conduct model walkthrough with all developers
- Publish PDF version of Data model and TCR.
- Work with testers during QA to help them understand the data.
- Check in Data models into model mart
- Request abbreviations to be added to the NSM file
Confidential
Data Modeler/Business Analyst
Responsibilities:
- Gap analysis to determine the missing fields in data foundation
- Work with the business to gather requirements and create the source-to-target document for the data mart.
- Data profiling of source to determine target column data type
- Enhance data model in Erwin to enhance the Claim Foundation tables.
- Designed the Hospital Settlement data mart.
- Conduct Data Model reviews with the Architecture team.
- Update the data dictionary document and publish it for use by settlement services.
- Design ETL architecture.
- Design tables for balancing dollar amounts.
Confidential
Data Modeler/Business Analyst
Responsibilities:
- Work with mainframe developers to obtain ETL logic from existing mainframe programs and document it in the source-to-target mapping document.
- Obtain ETL logic for new Medco file being integrated into the EDW from business users.
- Read existing ETL code to validate and understand the existing logic
- Hand off the Source to Target mapping document to offshore team.
- Support offshore team with open questions and issues with understanding business logic.
- Code review ETL code to ensure that it meets standards.
- Help testers validate data loaded in QA.
- Close any open questions about logic from business users.
- Work with the Data modeler to enhance existing tables and build new ones.
Confidential, WI
ETL Architect
Tools and Technologies: Informatica 8.x, Oracle 10g, PL-SQL, UNIX, ERWIN
Responsibilities:
- Interacted with business users and Data Architects, converting requirements into actionable specifications and designs
- Designed ETL mappings for performance and re-usability
- Performance tuned Informatica mappings.
- Conducted code reviews
- Conducted Design reviews
- Developed Unix Code for file compression
- Documentation
- Knowledge transition to IBM team
- Interacted with the offshore team.
- Designed and developed audit and balancing mapping for the Navinet file interfaces.
- Implemented best practices for Informatica and enforced standards.
- Designed Views to replace hierarchy tables.
- Mentored Junior Developers
Confidential
ETL Lead Developer/ETL Architect
Responsibilities:
- Designed the Low Level Design for Airline subject area.
- Data quality profiling using SQL.
- Documentation of best practices.
- Created Data Models and target table structures using ERWIN.
- Attended meetings for design and planning.
- Developed the test strategy for QA.
- Designed audit and balancing mappings
- Designed “quality check” testing mappings.
- Designed “data cleaning routines” for flat files received from external vendors.
As Lead Informatica Developer
- Developed Complex and efficient Informatica V8 mappings.
- Created Oracle stored procedures to implement business requirements and test data.
- Developed UNIX shell scripts to archive data and execute shell commands.
- Developed test cases for Informatica testing
- Interacted with business to collect the business requirements.
- Reverse engineered the code to capture existing requirements
- Coded business requirement enhancement
- Tested the code.
- Code review
- Developed processes to achieve standardization of code and process
- Mentoring junior developers.
As Informatica Administrator
- Worked as second-level Informatica administrator: created and maintained users and profiles, and took repository backups.
- Designing the Informatica Server Unix directories.
- Publishing and enforcing best practices for Informatica administration.
- Automation of routine processes
- Exploring opportunities of process improvement and suggesting the same.
As Oracle Developer
- Developed PL/SQL procedures (WEX module) for AP to reconcile charges made by hotels
- Tested the WEX module.
- Developed the Commission Receivable module to reconcile commissions paid by hotels against expected check amounts.
- Developed Oracle stored procedures to test data loaded by Informatica mappings.
- Developed Triggers to purge data at regular intervals.
- Created views to simplify development.
Tools and Technologies: Informatica 8.x, Oracle 10g, PL-SQL, UNIX, ERWIN
Confidential
ETL Architect/ETL Developer
Tools and Technologies: Informatica 8.x, Oracle 9i, DB2, UNIX, ERWIN
Responsibilities:
As ETL Architect
- Designed ETL Load strategy to load the Data Mart in less than 3 hours.
- Requirement analysis for non-WMR Adapters and creation of Tech Specs.
- Designed batch job and shell script logic for running the Informatica mappings
- Conducted tool evaluations for quick loading.
- Designed Test Strategy for testing the data.
As Lead Informatica Developer
- Designed Informatica 8 mappings to load the WMR tables
- Developed Informatica 8 mappings to load the WMR tables.
- Developed UNIX shell scripts that were used in the scheduler.
- Developed an understanding of business processes and table-population logic and explained them to developers.
- Load testing of mappings
- Code review and suggested best practices.
- Provided subject matter expertise.
As Oracle Developer
- Developed stored procedures to generate dummy data for the project.
- Developed test procedures for the testing team to verify that data loaded by the Informatica mappings is correct.
- Developed a stored procedure to populate tables that could not be populated by Informatica (looping logic).
- Served as backup Database Administrator/Data Modeler.
Confidential
ETL Architect
Tools and Technologies: Informatica 7.x, DB2, UNIX, Mainframe
Responsibilities:
As ETL Architect
- Designed ETL Architecture for PMC project, including staging area and load strategy.
- Analyzed data and conducted gap analysis after profiling the data.
- Analyzed requirements for migration from Ab Initio to Informatica.
- Designed the Auditing and Balancing process.
- Developed shell scripts to execute Informatica mappings
- Mentored and led developers to design, standardize, and implement best practices for ETL processes in the Confidential ETL Conversion Project.
- Conducted tool evaluations (Exeros) for Accenture and Confidential
- Designed Test Strategy for testing the data in Confidential ETL Conversion Project.
- Provided subject matter expertise.
- Module lead for Conversion Deliverables
- Designed processes to achieve standardization
Confidential
ETL Architect/Team Lead
Responsibilities:
- Interacted with business users and Data Architects, converting requirements into actionable specifications and designs
- Designed the Data Mart Staging Architecture, Mapping Design and ETL load process
- Designed Complex Audit and Balancing capabilities.
- Worked with the Data Modeler in the design of the data Model.
- Data profiling to better understand source data to ensure that it meets need of ETL process.
- Designed the shell script logic.
- Provided know-how on performance tuning ETL mappings.
- Designed the Confidential repository structure.
- Adopted and communicated new ideas and best practices.
- Provided status reports.
- Ensured timely delivery of project deliverables
- Undertook defect prevention and continuous improvement activities.
Tools and Technologies: Erwin 4.1, Informatica 8.x, DB2, UNIX, Mainframe.
Confidential
ETL Lead
Responsibilities:
- Designed Control Systems to calculate the load time and load performance and used them to reduce load window.
- Designed the Support manual for production support.
- Designed reports to capture production failures.
- Installed and maintained the local Informatica server and repository server.
- Designed migration process to QA and production.
- Created documents and presentations on new processes for user training and new-hire orientation.
- Led a team of 7 for support and enhancement.
Confidential
ETL Developer
Responsibilities:
- Interacted with business users, technology and external vendors to gather business requirements and converted them into actionable specifications and designs.
- Created a source-system profiling document, in the absence of any source-system documentation, to better understand the source system.
- Collected requirements and designed staging area and load strategy.
- Interacted with the Data Modeler to suggest structural changes to tables.
- Provided ETL logic to capture and store changed data.
- Provided ETL logic for backfilling of data.
- Installed and maintained the Informatica server and repository server.
- Led various technical resources and team members to achieve project goals and deliver solutions.
- Created documents and presentations on new processes for user training and new-hire orientation.
- Provided various user groups with controlled access to the information hub.
- Ensured proper documentation of all code.
- Designed the re-start logic for Informatica Batch Jobs.
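Restart logic like that described above is usually built on a checkpoint table: each completed step is recorded, so a rerun after a failure skips finished work instead of reloading everything. A minimal sketch with SQLite; the step names and table are hypothetical stand-ins for Informatica batch jobs:

```python
import sqlite3

# Hypothetical checkpoint table recording completed batch steps.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE batch_checkpoint (step TEXT PRIMARY KEY)")

def run_batch(conn, steps, executed):
    for step in steps:
        done = conn.execute(
            "SELECT 1 FROM batch_checkpoint WHERE step = ?", (step,)).fetchone()
        if done:
            continue  # completed in an earlier run; skip on restart
        executed.append(step)          # stand-in for running the real session
        conn.execute("INSERT INTO batch_checkpoint VALUES (?)", (step,))

steps = ["extract", "transform", "load"]
first_run = []
run_batch(conn, steps, first_run)   # runs all three steps
second_run = []
run_batch(conn, steps, second_run)  # restart: every step already checkpointed
```

In the real system the checkpoint would also carry the batch ID and be cleared once the whole batch completes, so the next scheduled run starts fresh.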
Tools and Technologies: Informatica 6.x, DB2, UNIX, Mainframe.
Confidential
Oracle Developer
Responsibilities:
- Analyzed user requirements.
- Designed logic for Oracle procedures and ETL code
- Developed Informatica mappings and PL-SQL code.
- Analyzed production support issues to resolve them and prevent recurrence.
- Developing test strategy for testing of procedures
- Mentoring developers
- Ensure timely delivery of deliverables.
- Tool evaluations.
- Code review
- Created status reports and attended status calls.
- Creating a process to reduce defects with the help of defect prevention meetings.
- Ensuring timely delivery of code.
- Migration of code from Development to QA.
- Reverse engineered existing code to document it.
- Developed code for payment of incentives to DSAs for credit card and loan sales.
- Knowledge transfer to WMR team.