Data Architect Resume
Portland, ME
PROFESSIONAL SUMMARY:
- 9+ years of industry experience as a Data Analyst, with a solid understanding of Data Modeling and evaluating data sources, and a strong understanding of Data Warehouse/Data Mart design, ETL, BI, OLAP, and Client/Server applications.
- Expert in writing and optimizing SQL queries in Teradata, Netezza, Oracle, and SQL Server.
- Excellent understanding of the Software Development Life Cycle (SDLC), with good working knowledge of testing methodologies, disciplines, tasks, resources, and scheduling.
- Excellent knowledge of Data Analysis, Data Validation, Data Profiling, Data Cleansing, and Data Verification, and of identifying data mismatches.
- Performed data analysis and data profiling using complex SQL on various source systems, including Oracle, Netezza, and Teradata.
- Excellent experience with Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad, and FastExport.
- Solid knowledge of Netezza architecture and the nzload and nzsql utilities.
- Strong experience using Excel and MS Access to extract and analyze data based on business needs.
- Excellent knowledge of Perl and UNIX.
- Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using ETL tools such as Ab Initio and Informatica PowerCenter.
- Experience in testing and writing SQL and PL/SQL statements: stored procedures, functions, triggers, and packages.
- Experienced in various Teradata utilities such as FastLoad, MultiLoad, BTEQ, and Teradata SQL Assistant.
- Comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing and data manipulation.
- Extensive experience in data modeling using Erwin, including Star Schema and Snowflake Schema modeling, fact and dimension tables, and physical and logical modeling.
- Expertise in designing project artifacts such as ETL specifications, data mappings, data dictionaries, and metadata repositories.
- In-depth knowledge of designing Tables, Views, Indexes, Table Partitioning, Collections, Materialized Views, Constraints and Nested Tables.
- Expertise in loading data using the Teradata loader connection, writing Teradata utility scripts (FastLoad, MultiLoad), and working with loader logs.
- Experience in designing error and exception handling procedures to identify, record and report errors.
- Expert in designing and implementing data warehouse structures (staging, fact, dimension, and aggregate tables) in Teradata, Netezza, and Oracle databases, using Star, Snowflake, and hybrid Starflake data warehouse methodologies.
- Excellent experience in writing SQL queries to validate data movement between the different layers of a data warehouse environment (see the sketch at the end of this summary).
- Excellent experience in troubleshooting test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
- Excellent knowledge of preparing required project documentation and regularly tracking and reporting project status to all project stakeholders.
- Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
- Experience in testing Business Intelligence reports generated by various BI tools such as Cognos and Business Objects.
- Good exposure to the offshore/onsite model, with the ability to understand and create functional requirements in collaboration with clients; good experience in requirements analysis and in generating test artifacts from requirements documents.
- Excellent at creating project artifacts, including specification documents, data mappings, and data analysis documents.
- An excellent team player and technically strong contributor, able to work with business users, project managers, team leads, architects, and peers, thus maintaining a healthy environment in the project.
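A minimal sketch of the layer-to-layer validation queries mentioned above; the staging/target table names (STG_CUSTOMER, DW_CUSTOMER) and the key column are hypothetical placeholders, and MINUS is the Teradata/Oracle set operator (EXCEPT elsewhere):

    -- Reconcile row counts between the staging layer and the target layer
    SELECT 'STG_CUSTOMER' AS layer, COUNT(*) AS row_count FROM STG_CUSTOMER
    UNION ALL
    SELECT 'DW_CUSTOMER', COUNT(*) FROM DW_CUSTOMER;
    -- Flag keys that were staged but never reached the target
    SELECT customer_id FROM STG_CUSTOMER
    MINUS
    SELECT customer_id FROM DW_CUSTOMER;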
TECHNICAL SKILLS:
Programming Languages: SQL, PL/SQL, UNIX Shell Scripting, VBScript, Perl, AWK, SED
Databases: Oracle 11g/10g/9i, Teradata R12/R13, MS SQL Server 2005/2008, MS Access, Netezza TwinFin
Testing and Defect Tracking Tools: HP/Mercury (Quality Center, WinRunner, LoadRunner, QuickTest Professional, Performance Center, VU Scripting, Business Availability Center), Requisite, MS Visio & Visual SourceSafe, Salesforce
Operating Systems: Windows Vista/XP/2000/98/95, DOS, UNIX
ETL/Data Warehouse Tools: Informatica 9.5/9.1/8.6.1/8.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SAP Business Objects XI R3.1/XI R2, Web Intelligence
Data Modeling: Star Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables, Pivot Tables, Erwin
Tools & Software: TOAD, MS Office Suite (Word, Excel, MS Project, and Outlook), BTEQ, Teradata SQL Assistant, VSS
PROFESSIONAL EXPERIENCE:
Confidential, Portland, ME
Sr. Data Analyst/Data Architect
Responsibilities:
- Created data dictionaries and/or other types of metadata repositories for the old and new schemas
- Responsible for gathering and translating business requirements into data queries (SQL), models, and tools to allow for insightful data analysis
- Wrote advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements including identifying the tables and columns from which data is extracted.
- Provided input into developing quality assurance standards for functional, regression, performance, and load testing
- Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.
- Created data trace map and data quality mapping documents.
- Performed Data Profiling, Data Cleansing, Data Auditing and Data Quality and used Erwin for data modeling.
- Performed extensive tasks including Data Quality Analysis, Data Lineage, and Data Standardization across data structures, database design, data warehouses, business intelligence/analytics tools, SQL, ETL tools, and data integration methods
- Worked on system-to-system data profiling for integration and for extraction/transformation/loading rules, with limited user interaction
- Solid expertise in data modeling concepts, data cardinality, and normalized/de-normalized designs
- Used extensive SQL for data profiling/analysis to provide guidance in building the data model (see the profiling sketch after this section)
- Created interface specification documents, business-requirements data-level mappings, and ETL rules for application development
- Responsible and accountable for managing all Systems Analysis phase deliverables at the project level, such as the traceability matrix, requirements sign-off, functional specification creation and sign-off, review of test scenarios to ensure complete coverage of release contents, and management of change controls and their integration into existing work items in flight.
- Defined and assisted with functional, technology-based solutions to meet business requests within established technology frameworks and project timelines.
- Identified, analyzed and resolved complex system/process issues.
- Participated and/or facilitated meetings with stakeholders during all phases of the System Development Life Cycle
- Provided support to team members to assist in identifying data sources, strategies for completing queries, and clarification of requestor needs.
Environment: PL/SQL, Business Objects XI R2, Informatica 9.5/9.1/8.6, Oracle 11g, Teradata V2R12/R13.10, Teradata SQL Assistant 12.0
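A minimal profiling sketch of the kind referenced above (the table SRC_ACCOUNT and column account_id are hypothetical; the pattern is standard SQL that runs on Oracle or Teradata):

    -- Column-level profile: volume, distinctness, null rate, and value range
    SELECT COUNT(*)                                            AS total_rows,
           COUNT(DISTINCT account_id)                          AS distinct_values,
           SUM(CASE WHEN account_id IS NULL THEN 1 ELSE 0 END) AS null_count,
           MIN(account_id)                                     AS min_value,
           MAX(account_id)                                     AS max_value
    FROM   SRC_ACCOUNT;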
Confidential, MN
Sr. Data Modeler/ Data Analyst/Data Architect
Responsibilities:
- Worked on extracting, aggregating, pivoting, analyzing, and presenting data in various formats using various software tools.
- Worked on data quality, data organization, metadata, and data profiling
- Mapped legacy data fields to interface file fields, developed business rules, developed value maps, cleansed legacy data, assisted with converted-data clean-up, assisted in audits, and performed both manual and automated data audits
- Framed and conducted complex analyses and tests using large, complex (not always well-structured, highly variable) data sets
- Drew conclusions and effectively communicated findings to both technical and non-technical team members
- Translated functional requirements into Data Integration Technical Design documents in alignment with DI technical architecture/framework.
- Worked as a Data Analyst/Data Mapper to validate the logical source-to-target mapping and help complete the physical source-to-target mapping (see the sketch after this section).
- Designed and built, per the design and mapping documentation, multiple data integration applications in Informatica, Teradata SQL/scripting, Korn shell scripting, and the Control-M scheduler, and supported their deployment and execution.
- Performed thorough unit testing of the developed code and documented test cases/test results.
- Participated in peer code reviews and tech-lead code reviews using a code review checklist, and documented discrepancies as code review/unit defects.
- Managed and supported QA/UAT through the SDLC and provided timely support and resolution of design and code defects.
- Collaborated with architects, business analysts, quality assurance, and production support during the respective SDLC phases of a project and RTE executions.
- Ensured projects were successfully completed according to agreed terms for scope, cost, quality, and schedule.
- Contributed to, developed, and maintained best practices for data integration application development.
Environment: PL/SQL, Business Objects XI R2, Informatica 9.5/9.1/8.6, Oracle 11g, Teradata V2R12/R13.10, Teradata SQL Assistant 12.0
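A minimal sketch of the source-to-target mapping validation described above; the tables SRC_PARTY/TGT_PARTY and the trim-and-uppercase transformation rule are hypothetical, and MINUS is the Teradata/Oracle set operator:

    -- Source rows whose transformed values are missing or wrong in the target fall out here
    SELECT src_key, TRIM(UPPER(src_name)) AS expected_name
    FROM   SRC_PARTY
    MINUS
    SELECT party_key, party_name
    FROM   TGT_PARTY;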
Confidential, Germantown, MD
Sr. Data Analyst/Data Modeler
Responsibilities:
- Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
- Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
- Troubleshot test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
- Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
- Performed data reconciliation between integrated systems.
- Produced metrics reporting, data mining, and trend analysis in a helpdesk environment using Access.
- Wrote complex SQL queries to validate data against various reports generated by Business Objects XI R2 (see the sketch after this section).
- Extensively used MS Access to pull data from various databases and integrate it.
- Worked on SAS for Data Analysis.
- Assisted in the oversight for compliance to the Enterprise Data Standards
- Imported and cleansed high-volume data from various sources such as Teradata, Oracle, flat files, and SQL Server 2005
- Worked with Excel Pivot tables.
- Created and monitored workflows using Workflow Designer and Workflow Monitor.
- Performed data management projects and fulfilled ad hoc requests according to user specifications, using data management software and tools such as Perl, TOAD, MS Access, Excel, and SQL
- Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements mapped to test scripts, ensuring that any change control in requirements leads to a test case update.
- Performed extensive data validation by writing complex SQL queries, carried out back-end testing, and worked on data quality issues.
- Developed regression test scripts for the application; gathered, analyzed, and reported metrics to the concerned teams; and tested the testing programs
- Analyzed mainframe data to generate reports for business users.
- Identified and recorded defects with the information required for issues to be reproduced by the development team.
Environment: PL/SQL, Business Objects XI R2, Informatica 9.1/8.6, Oracle 11g, Teradata V2R12/R13.10, Teradata SQL Assistant 12.0
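A minimal sketch of the report-validation pattern described above, comparing warehouse aggregates with a report extract; all table and column names (DW_SALES, REPORT_EXTRACT) are hypothetical:

    -- Months where the warehouse total and the reported total disagree (or one side is missing)
    SELECT COALESCE(w.report_month, r.report_month) AS report_month,
           w.total_amount AS warehouse_total,
           r.total_amount AS report_total
    FROM  (SELECT report_month, SUM(amount) AS total_amount
           FROM   DW_SALES
           GROUP  BY report_month) w
    FULL OUTER JOIN REPORT_EXTRACT r
          ON r.report_month = w.report_month
    WHERE w.total_amount <> r.total_amount
       OR w.total_amount IS NULL
       OR r.total_amount IS NULL;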
Confidential, Pittsburgh, PA
Sr. Data Analyst/Data Modeler
Responsibilities:
- Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
- Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
- Set up the environments to be used for testing and defined the range of functionality to be tested, per the technical specifications.
- Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
- Responsible for different Data mapping activities from Source systems to Teradata
- Created the test environment for Staging area, loading the Staging area with data from multiple sources.
- Responsible for analyzing various heterogeneous data sources, including flat files, ASCII data, EBCDIC data, and relational data (Oracle, DB2 UDB, MS SQL Server).
- Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text)
- Executed campaigns based on customer requirements
- Followed company code standardization rules
- Performed ad hoc analyses as needed, with the ability to interpret and explain the results
- Involved in Teradata SQL development, unit testing, and performance tuning, ensuring testing issues were resolved using defect reports.
- Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into various databases
- Experience in creating UNIX scripts for file transfer and file manipulation.
- Provided support to the client in assessing how many virtual user licenses would be needed for performance testing.
- Ensured the onsite-to-offshore transition, QA processes, and closure of problems and issues.
- Tested the database for field size validation, check constraints, and stored procedures, cross-verifying the field sizes defined within the application against the metadata (see the sketch after this section).
Environment: Informatica 8.1, DataFlux, Oracle 9i, Quality Center 8.2, SQL, TOAD, PL/SQL, Flat Files, Teradata
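A minimal sketch of the field-size cross-check described above; ALL_TAB_COLUMNS is Oracle's standard data dictionary view, while APP_FIELD_SPEC is a hypothetical table holding the application's declared field sizes:

    -- Flag columns whose database-defined length disagrees with the application's spec
    SELECT s.table_name, s.column_name,
           s.app_field_size, c.data_length AS db_field_size
    FROM   app_field_spec s        -- hypothetical application spec table
    JOIN   all_tab_columns c
           ON  c.table_name  = s.table_name
           AND c.column_name = s.column_name
    WHERE  c.data_length <> s.app_field_size;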
Confidential, Colorado Springs, CO
Data Analyst
Responsibilities:
- Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center 9.0
- Involved in developing detailed test plan, test cases and test scripts using Quality Center for Functional and Regression Testing.
- Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
- Tested the reports using Business Objects functionality such as queries, slice and dice, drill down, cross tab, master detail, and formulas
- Involved in Teradata SQL Development, Unit Testing and Performance Tuning
- Tested complex ETL mappings and sessions, based on business user requirements and business rules, to load data from source flat files and RDBMS tables to target tables (see the sketch after this section).
- Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing)
- Tested several stored procedures.
- Validated several Business Objects reports. Reviewed and tested business requests for data and data usage.
- Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into various databases
- Responsible for data mapping testing, writing complex SQL queries using WinSQL
- Experience in creating UNIX scripts for file transfer and file manipulation.
- Validating the data passed to downstream systems.
- Worked with Data Extraction, Transformation and Loading (ETL).
- Involved in testing data mapping and conversion in a server based data warehouse.
- Involved in testing the UI applications
- Involved in Security testing for different LDAP roles.
- Tested whether the reports developed in Business Objects are as per company standards.
- Used Quality Center to track and report system defects
- Involved in testing the XML files and checked whether data was parsed and loaded to the staging tables.
Environment: Informatica 8.1/7.1, Informix, DB2, Java, Business Objects, SQL, SQL Server 2000/2005, Teradata V2R6 (MLOAD, FLOAD, FASTEXPORT, BTEQ), Teradata SQL Assistant 7.0, TOAD, XML, XSLT, IBM AIX 5.3, UNIX, Shell Scripting, WinSQL, Rumba UNIX Display, Quality Center 8.2
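A minimal sketch of one check used in the ETL mapping testing above: fact rows whose foreign keys have no matching dimension row; FACT_ORDER, DIM_CUSTOMER, and customer_key are hypothetical names:

    -- Orphan check: fact rows referencing a customer key absent from the dimension
    SELECT f.customer_key, COUNT(*) AS orphan_rows
    FROM   FACT_ORDER f
    LEFT JOIN DIM_CUSTOMER d
           ON d.customer_key = f.customer_key
    WHERE  d.customer_key IS NULL
    GROUP  BY f.customer_key;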
Confidential
Data Analyst
Responsibilities:
- Involved with the design and development team to implement the requirements.
- Developed and manually executed test scripts to verify expected results
- Designed and developed ETL processes using the Informatica ETL tool for dimension and fact file creation
- Designed and created test cases based on the business requirements (also referencing the source-to-target detailed mapping document and the transformation rules document).
- Involved in extensive DATA validation using SQL queries and back-end testing
- Used SQL for Querying the database in UNIX environment
- Developed separate test cases for ETL process (Inbound & Outbound) and reporting
- Defined the Scope for System and Integration Testing
- Prepared and submitted summarized audit reports and took corrective actions
- Involved in uploading master and transactional data from flat files, preparing test cases, and subsystem testing.
- Documented and published test results; troubleshot and escalated issues
- Prepared various test documents for the ETL process in Quality Center.
- Involved in test scheduling and milestone planning, with dependencies
- Performed functionality testing of email notifications for ETL job failures, aborts, and data issues.
- Identified, assessed, and communicated potential risks associated with the testing scope, product quality, and schedule
- Created and executed test cases for ETL jobs that upload master data to the repository (see the sketch after this section).
- Responsible for understanding, and training others on, enhancements and newly developed features
- Conducted load testing and provided input into capacity planning efforts.
- Provided support to the client in assessing how many virtual user licenses would be needed for performance testing, specifically load testing using LoadRunner
- Created and executed test scripts, cases, and scenarios to determine optimal system performance according to specifications.
- Modified the automated scripts from time to time to accommodate the changes/upgrades in the application interface.
- Tested the database for field size validation, check constraints, and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
Environment: Windows XP, Informatica PowerCenter 6.1/7.1, QTP 9.2, TestDirector 7.x, LoadRunner 7.0, Oracle 10g, UNIX AIX 5.2, Perl, Shell Scripting
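A minimal sketch of one such test case: verifying that the staged master data carries no duplicate natural keys before the upload; STG_MASTER_DATA and material_code are hypothetical names:

    -- Duplicate natural keys in staging would violate the master-data load's uniqueness rule
    SELECT material_code, COUNT(*) AS occurrences
    FROM   STG_MASTER_DATA
    GROUP  BY material_code
    HAVING COUNT(*) > 1;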