
Sr. Data Architect/Modeler Resume


Tampa, FL

SUMMARY:

  • Over 8 years of experience as a Senior Data Architect/Modeler/Analyst, an IT professional experienced in Data Analysis, Data Modeling, Data Architecture, and designing, developing, and implementing data models for enterprise-level applications and systems.
  • Data Warehousing: full life-cycle project leadership, business-driven requirements gathering, capacity planning, feasibility analysis, enterprise and solution architecture, design, construction, data quality, profiling and cleansing, source-target mapping, gap analysis, data integration/ETL, SOA, ODA, data marts, Inmon/Kimball methodology, data modeling for OLTP, canonical modeling, and dimensional modeling for data warehouse star/snowflake design.
  • Highly experienced with budgeting, reporting, forecasting and modeling, including dashboard reports and pivot tables.
  • Good knowledge of data security standards (FFIEC, PCI-DSS, GLBA) and experience implementing solutions, beyond analysis and assessments, that meet requisite compliance.
  • Hands-on experience with the CDID, TIC, EDS and Lognet systems, loading data for the conversion team through DataStage jobs.
  • Experienced in using various Hadoop ecosystem components such as MapReduce, Hive, Sqoop, and Oozie.
  • Hands-on experience configuring Redshift, Elasticsearch and DynamoDB with EC2 instances.
  • Experience in BI/DW solutions (ETL, OLAP, data marts), Informatica, and BI reporting tools such as Tableau and QlikView; also experienced leading teams of application, ETL, and BI developers and testers.
  • Responsible for detailed architectural design, data wrangling, and data profiling to ensure the quality of vendor data, including source-to-target mapping (see the sample profiling query after this list).
  • Hands-on experience in threat modeling of complex security systems using techniques such as STRIDE, DREAD, and OCTAVE.
  • Extensive experience mapping lineage of CDEs from analytic output to different Netezza sources.
  • Strong experience in Data Analysis, Data Migration, Data Validation, Data Cleansing, Data Verification, identifying data mismatches, Data Import, and Data Export using multiple ETL tools such as Pentaho, Talend and Informatica PowerCenter.
  • Experience in developing Entity-Relationship diagrams and modeling transactional databases and data warehouses using tools like Erwin, ER/Studio and PowerDesigner.
  • Logical and physical database design, including tables, constraints, indexes, etc., using Erwin, ER/Studio, TOAD Data Modeler and SQL Modeler.
  • Excellent knowledge in preparing required project documentation and tracking and reporting regularly on the status of projects to all project stakeholders
  • Extensive ETL testing experience using Informatica 9x/8x, Talend, Pentaho.
  • Worked on background processes in Oracle architecture; also able to drill down to the lowest levels of systems design and construction.
  • Knowledge of cryptographic standards and algorithms such as AES, SHA, DES and the PKCS standards.
  • Highly skilled in ETL tools (DataStage, Talend, Informatica, SSIS/SSRS) and in Teradata architecture features such as PDE, AMP, BYNET, PE, vDisk and Virtual Storage System (VSS).
  • Experience migrating data from RDBMS to Redshift.
  • Experience in Dimensional Data Modeling, Star/Snowflake schema, FACT & Dimension tables.
  • Expertise in relational data modeling (3NF) and dimensional data modeling.
  • Experience in Informatica PowerCenter 9.x/8.x/7.x with Oracle, SSIS and SQL Server data warehouses in Microsoft DW/BI environments.
  • Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager.
  • Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing and analysis of big data.
  • Experienced in Technical consulting and end-to-end delivery with architecture, data modeling, data governance and design - development - implementation of solutions.
  • Heavy use of Access queries, VLOOKUP, formulas, pivot tables, etc. Working knowledge of CRM automation with Salesforce.com and SAP.
  • Experienced in integration of various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL, COBOL, XML and flat files to a Netezza database.
  • Business Intelligence: requirements analysis, Key Performance Indicators (KPI), metrics development, sourcing and gap analysis, OLAP concepts and methods, aggregates/materialized views and performance, rapid prototyping, tool selection, and semantic layers.
  • Excellent experience in writing SQL queries to validate data movement between different layers in a data warehouse environment.
  • Assist in creating communication materials based on data for key internal /external audiences.
  • Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2 & Teradata.
  • Proficient in System Analysis, ER/Dimensional Data Modeling, Database design and implementing RDBMS specific features.
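
For illustration, a minimal sketch of the kind of profiling query used to assess vendor data quality, as referenced above; the table and column names are hypothetical:

    -- Profile a hypothetical vendor feed: volume, key completeness, and duplicates
    SELECT
        COUNT(*)                                            AS total_rows,
        COUNT(DISTINCT vendor_id)                           AS distinct_vendors,
        SUM(CASE WHEN vendor_id IS NULL THEN 1 ELSE 0 END)  AS null_vendor_ids,
        SUM(CASE WHEN tax_id    IS NULL THEN 1 ELSE 0 END)  AS null_tax_ids
    FROM staging.vendor_feed;

    -- Flag duplicate business keys before loading to the warehouse
    SELECT vendor_id, COUNT(*) AS dup_count
    FROM staging.vendor_feed
    GROUP BY vendor_id
    HAVING COUNT(*) > 1;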

TECHNICAL SKILLS:

Data Modeling Tools: Erwin, Rational System Architect, IBM Infosphere Data Architect, ER Studio and Oracle Designer

Database Tools: Microsoft SQL Server 12.0, Teradata 15.0, Oracle 9i/11g/12c and MS Access

Version Tool: VSS, SVN, GIT.

Project Execution Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD)

BI Tools: Tableau 7.0/8.2, Tableau Server 8.2, Tableau Reader 8.1, SAP BusinessObjects, Crystal Reports

Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, SharePoint Portal Server

Quality Assurance Tools: WinRunner, LoadRunner, TestDirector, QuickTest Pro, Quality Center, Rational Functional Tester.

Tools: OBIEE 10g/11g, SAP ECC6 EHP5, GoToMeeting, DocuSign, InsideSales.com, SharePoint, MATLAB.

ETL/Data warehouse Tools: Informatica 9.6/9.1/8.6.1/8.1, Informatica CC360, SAP BusinessObjects XI R3.1/XI R2, Web Intelligence, Talend, Tableau 8.2.

Testing and defect tracking Tools: HP/Mercury Quality Center, QuickTest Professional, Performance Center, RequisitePro, MS Visio.

Operating System: Windows, Unix, Sun Solaris

PROFESSIONAL EXPERIENCE:

Confidential, Tampa, FL

Sr. Data Architect/Modeler

Responsibilities:

  • Worked as a Data Modeler/Architect to generate data models using Erwin and developed relational database systems.
  • Created the logical and physical data models for the CDID, TIC, EDS and Lognet systems for the conversion team to load the data through DataStage jobs.
  • Involved in requirements gathering and analysis with business analysts, systems analysts, developers and DBAs, and translated them into detailed reporting requirements.
  • Performed data standardization to drive meaningful business reports based on consultation and analysis of internal and external customers and product needs.
  • Executed reports in the SSRS runtime environment for a given set of parameter values.
  • Developed customized reports and dashboards utilizing Mondrian OLAP, CDE plugins, HTML, CSS and JavaScript.
  • Performed threat modeling of complex security systems using techniques such as STRIDE.
  • Involved in creation of generic SQL scripts to implement changes on tables, constraints, foreign keys and indexes across all environments, as requested by business architects, through the database modification change request (DMCR) system.
  • Routinely analyzed computer networks for security purposes using a variety of network analysis and data security monitoring tools.
  • Used Informatica Cloud Customer 360 to help marketing maximize lead-generation ROI and sales CRM effectiveness.
  • Installed and implemented RACF on a 3090 VM/MVS mainframe.
  • Used TIBCO BusinessWorks to create and invoke SOAP over HTTP web services.
  • Used VLOOKUPs, macros and pivot tables for data mining.
  • Worked extensively with importing metadata into Hive and migrated existing tables and applications to work on Hive and Spark.
  • Hands-on experience with Netezza, SQL Server, DB2, Oracle 11g/10g and Teradata databases.
  • Executed change controls in the DB2 z/OS environment through TSO and SPUFI.
  • Expertise in data manipulation using SAS DATA steps (SAS formats/informats, merge) and procedures such as PROC APPEND, PROC DATASETS, PROC SORT and PROC TRANSPOSE.
  • Creation of Report Layout and configuration of report environment using SSRS
  • Designed the Logical Data Model using ERWIN 9.64 with the entities and attributes for each subject areas.
  • Build and Deployed EAR files using TIBCO Designer, Administrator.
  • Developed long term data warehouse roadmap and architectures, designs and builds the data warehouse framework per the roadmap.
  • Investigate reported data security incidents and reported machine compromises using security tools such as Vectra, Stealth Watch, Splunk, Palo Alto Firewall and Endgame.
  • Developed Spark jobs and Hive Jobs to summarize and transform data.
  • Implemented Change Data Capture technology in Talend in order to load deltas to the Data Warehouse.
  • Involved in extraction, transformation and loading of data directly from different source systems (flat files/Excel/Oracle/MSSQL/Teradata) using SAS/SQL, SAS/macros.
  • Hands-on experience implementing and building big data and analytical architectures on Amazon Web Services (AWS).
  • Reviewed the files of hundreds of mortgage companies using vlookup and pivot tables.
  • Implemented integration of SSL based Web Service calls using TIBCO BW.
  • Used Spark for interactive queries, processing of streaming data and integration with popular NoSQL database for huge volume of data.
  • Analyzed computer networks for data security purposes using a variety of network analysis and security monitoring tools; responded to incidents; developed and evaluated data security tools.
  • Experienced in Cloud automation using AWS Cloud Formation templates, Python, Ruby, Chef (DevOps), Vagrant.
  • Migrated the MDM process and data from the existing Profisee application to Informatica MDM (CC360), rewriting the mastering process and scheduling inbound and outbound feeds.
  • Worked on Common Desktop Environment (CDE 2.1) along with libraries such as ToolTalk and Motif.
  • Researched and compared EMC / VMware's GemFire and SQLFire products and proposed the initial solution
  • Generated compliance, diversion/detection, trending and sales reports for daily processes and ad-hoc requests using BrioQuery Explorer.
  • Worked as a Hadoop consultant on (Map Reduce/Pig/HIVE/Sqoop).
  • Worked on loading data from console log files and Oracle into Redshift for Tableau reporting (a sample load command follows this list).
  • Used the data integration tool Pentaho to design ETL jobs in the process of building data warehouses and data marts.
  • Worked on mapping lineage of CDEs from analytic output to Netezza sources.
  • Involved in architecting the data integration, created the data model for the data warehouse, advised the client on the importance of data governance to consistently maintain the data quality
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Responded to and implemented data security, incident and firewall requests.
  • Developed a householding model with MDM as the source.
  • Physically migrated the developed reports onto the development SSRS server.
  • Specified overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, MDM, data warehouse, data provisioning, ETL, and BI.
  • Maintained data mapping documents, business matrices and other data design artifacts.
  • Performed data modeling and designed, implemented and deployed high-performance custom applications at scale on Hadoop/Spark.
  • Analyzed, designed, developed, implemented and maintained ETL jobs using IBM InfoSphere DataStage and Netezza.
  • Created large datasets by combining individual datasets using various inner and outer joins in SAS/SQL and dataset sorting and merging techniques using SAS/Base.
  • Subject Matter Expert in building analytics on Real-time frameworks such as Kinesis or Kafka, and/or Data Warehousing
  • Configure the MDM hub according to the business needs.
  • Led implementation of the Fuel ODS and Invoice Data Repository, and served as SME for the Purchase Order process rewrite.
  • Debugged and compiled a DBMS release written in PL/I, running in CMS, TSO, and MVS environments. Developed CAD interface programs in C and IBM JCL system installation
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Applied data analysis, data mining and data engineering to present data clearly.
  • Ensured high-quality data and an understanding of how data is generated from experimental design and how these experiments can produce actionable, trustworthy conclusions.
  • Maintained user accounts (IAM) and the RDS, Route 53, VPC, DynamoDB, SES, SQS and SNS services in the AWS cloud.
  • Wrote Hadoop jobs for analyzing data using Hive and Pig, accessing text-format files, sequence files and Parquet files.
  • Developed standard and comparison reports in DataStage for the legacy corporate systems against individual facility data; the legacy systems are Customer data (CDID), The Item Catalog (TIC), Part Numbers (EDS) and Lognet (Vendor).
  • Launched, configured and maintained EC2 instances, Redshift, S3 and HANA One.
  • Reverse engineered some of the databases using Erwin.
  • Proficiency in SQL across a number of dialects, commonly MySQL, PostgreSQL, Redshift, SQL Server and Oracle.
  • Implemented Spark using Scala, utilizing DataFrames and the Spark SQL API for faster processing of data.
  • Developed PII VB solution for identification of non-PII and Suspect PII and subsequent verification with SMEs to identify elements requiring masking/encryption.
  • Served as SME for commerce platform, customer, product and sales functional data.
  • Performed extensive ETL testing using Informatica 9.x/8.x, Talend and Pentaho.
  • Complete knowledge of the AWS services offered for collecting, processing, storing and analyzing big data (Kinesis Streams, Lambda, Elastic MapReduce, DynamoDB, Redshift, Elasticsearch, QuickSight).
  • Loaded large volumes of data from different mainframe (JCL) sources into Sybase using Sybase's BCP utility.
  • Design and development of ETL routines to extract data from heterogeneous sources and loading to Actuarial Data Warehouse.
  • Good experience creating cubes using Pentaho Schema Workbench.
  • Installed the MDM Hub and Cleanse servers.
  • Mainframe Software development in COBOL and JCL for z/OS System for Vanguard Institutional Services
  • Advised on and led projects involving ETL-related activities and the migration or conversion of data between enterprise data systems; coordinated interactions between central IT, business units and data stewards to achieve desired organizational outcomes.
  • Gathered and analyzed existing physical data models for in scope applications and proposed the changes to the data models according to the requirements.
  • Exported business data to Excel for further analysis utilizing SSRS
  • Wrote ETL using Talend to pull data from various spreadsheets, MySQL, Informix, SQL Server and Oracle to load data to MySQL and Oracle databases in the Amazon Cloud and on-premise.
  • Advises on and enforces data governance to improve the quality/integrity of data and oversight on the collection and management of operational data.
  • Used Hive to analyze data ingested into HBase by using Hive-HBase integration and compute various metrics for reporting on the dashboard
  • Created jobs to extract ECC XML messages, using stages such as the MQ Connector and XML transformation stages, and load them into Netezza tables.
  • Build both real time and batch oriented big data solutions using services such as Amazon Elastic Compute Cloud (EC2), S3, DynamoDB, Elastic Map Reduce (EMR), Kinesis, Redshift, and Relational Database Service (RDS)
  • Conducted research on security policy; consulted with university customers on security-related matters; kept up with information and data security announcements.
  • Developed a search services framework with MDM as the source.
  • Worked on IBM mainframe to Oracle RAC/Linux environment migration and Oracle Fusion Middleware products, focusing on SOA, BPEL and integration technologies for the corporate mainframe (COBOL, JCL, DB2) to open systems (Java, ksh, Oracle SOA Suite, Oracle RDBMS) migration.
  • Guided and partnered with VPs/Directors in architecting solutions for the big data organization.
  • Integrated Crystal Reports using Erwin Data Modeler.
  • Created and executed stored procedures for generation of SSRS reports.
  • Performed system administration for IBM PureData System for Analytics, powered by Netezza.
  • Worked with NoSQL MongoDB and heavily with Hive, HBase and HDFS.
  • Used Erwin's support for Teradata v15 and SSL.
  • Developed Data Mapping, Data Governance, and Transformation and cleansing rules for the Master Data Management Architecture involving OLTP, ODS.
  • Involved in designing Logical and Physical data models for different database applications using the Erwin.
  • Experience with AWS ecosystem (EC2, S3, RDS, Redshift).
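
For illustration, a minimal sketch of the kind of Redshift load referenced above; the S3 bucket, table and IAM role are hypothetical placeholders:

    -- Assumed staging table for console-log extracts (names are illustrative)
    CREATE TABLE IF NOT EXISTS staging.console_log_events (
        event_id   BIGINT,
        event_ts   TIMESTAMP,
        user_id    VARCHAR(64),
        event_type VARCHAR(32),
        payload    VARCHAR(4096)
    );

    -- Bulk-load gzipped, pipe-delimited files from S3 into Redshift
    COPY staging.console_log_events
    FROM 's3://example-bucket/console-logs/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load-role'
    DELIMITER '|'
    GZIP
    TIMEFORMAT 'auto'
    MAXERROR 10;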

Environment: Oracle 12c, MS Office, Informatica, SQL Architect, TOAD Benchmark Factory, Teradata v15, SQL*Loader, Big Data, SharePoint, Erwin r9.64, DB2, SQL Server 2008/2012.

Confidential, Detroit, MI

Sr. Data Analyst/Modeler

Responsibilities:

  • Gathered and translated business requirements; worked with the Business Analyst and DBA on requirements gathering, business analysis, testing and project coordination.
  • Extensively used Erwin 9.1 to develop data models using star schema methodologies.
  • Used CA Erwin Data Modeler (Erwin) for data modeling (data requirements analysis, database design, etc.) of custom-developed information systems, including databases of transactional systems and data marts.
  • Extensive Experience working with business users/SMEs as well as senior management.
  • Performed statistical data analysis, generated ad-hoc reports, tables, listings and graphs using tools such as SAS/Base, SAS/Macros, SAS/Graph, SAS/SQL and SAS/STAT.
  • Involved in other projects utilizing IMS and DB2 z/OS with tools, TSO and PROFS on IBM 3090, and Lotus 1-2-3 (BASIC) on IBM PC.
  • Responded to reported machine compromises; investigated reported data security incidents; maintained appropriate documentation during investigations; and wrote technical documents and security advisories.
  • Generated multiple enterprise reports using SSRS from the SQL Server database (OLTP) and SQL Server Analysis Services database (OLAP), including various reporting features such as group by, drill-downs, drill-throughs, sub-reports and navigation reports (hyperlinks).
  • In parallel with development, acted as a Talend admin: creating projects, scheduling jobs, migrating to higher environments and performing version upgrades.
  • Worked on Change Data Capture process to replicate transactional data from Sterling IBM DB2 database to CDW Netezza database.
  • Responsible for loading data pipelines from web servers and Teradata using Sqoop with Kafka and the Spark Streaming API.
  • Experienced working with Excel Pivot and VBA macros for various business scenarios.
  • Involved in Java, J2EE coding and job controlling with mainframe JCL
  • Participated in requirements session with IT Business Analysts, SME's and business users to understand and document the business requirements as well as the goals of the project.
  • Owned and managed all changes to the data models. Created data models, solution designs and data architecture documentation for complex information systems.
  • Involved in writing code using Base SAS & SAS/Macros to clean and validate data from tables.
  • Worked on reverse engineering data models from database instances and scripts.
  • Extensively used Erwin r9.1 for data modeling; created staging and target models for the Enterprise Data Warehouse.
  • Worked collaboratively to manage build-outs of large data clusters and real-time streaming with Spark.
  • Owned the design, development and maintenance of ongoing metrics, reports, analyses and dashboards; integrated with partner ETL, analytics and visualization (BI) tools.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN.
  • Worked with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Created parameterized reports (SSRS 2008/2012) with report criteria to minimize report execution time and limit the number of records returned.
  • Involved in data extraction for various databases and files using Talend.
  • Created BTEQ, FastExport, MultiLoad, TPump and FastLoad scripts for extracting data from various production systems.
  • Developed a standard, structured, best-practice approach to the selection of new technology, including client data security.
  • Developed Spark streaming application to pull data from cloud to Hive table.
  • Worked on AWS Redshift and RDS, implementing models and data on both services.
  • Developed mapping spreadsheets for the ETL team with source-to-target data mappings, physical naming standards, data types, volumetrics, domain definitions and corporate metadata definitions.
  • Responsible for developing and supporting a data model and architecture that supports and enables the overall strategy of expanded data deliverables, services, process optimization and advanced business intelligence.
  • Used SAS Procedures like PROC FREQ, PROC SUMMARY, PROC MEANS, PROC SQL, PROC SORT, PROC PRINT, PROC Tabulate, PROC UNIVARIATE, PROC PLOT and PROC REPORT to generate various regulatory and ad-hoc reports.
  • Experience with scripting for automation (e.g., Unix shell scripting, Python, Perl, Ruby)
  • Excellent experience and knowledge on data warehouse concepts and dimensional data modeling using Ralph Kimball methodology.
  • Enhanced and supported analysis, design and programming of IBM Mainframe 390 z/OS MVS, TSO-based application systems with COBOL, the IMS DB/DC hierarchical database system, VSAM, File-AID and SyncSort for contract clients.
  • Performance-tuned Spark jobs by changing configuration properties and using broadcast variables.
  • Supported JCL implementation for User acceptance testing and Production Simulation Testing.
  • Designed and troubleshot Pentaho Data Integration (Kettle) jobs and transformations.
  • Designed star schemas and snowflake schemas with dimension and fact tables.
  • Expertise in Informatica, DB2, MicroStrategy and UNIX shell scripting.
  • Used Talend for Extraction and Reporting purpose.
  • Extensive experience as a Hadoop developer with strong expertise in Hive, Pig, Spark and Spark Streaming.
  • Worked with the Data Vault methodology; developed normalized logical and physical database models.
  • Designed ER diagrams (Physical and Logical using Erwin) and mapping the data into database objects and identified the Facts and Dimensions from the business requirements and developed the logical and physical models using Erwin.
  • Strong understanding of ETL concepts and experience building them with large-scale, complex datasets
  • Analysis, reporting and tracking of defects on a daily basis.
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Implemented Data Vault modeling concepts, solving the problem of dealing with change in the environment by separating the business keys, and the associations between those business keys, from the descriptive attributes of those keys using Hub and Link tables and Satellites.
  • Responsible for maintenance and enhancements in reporting systems using SAS/BASE, SAS/MACRO, SAS/ STAT, and SAS/GRAPH.
  • Used an Amazon EC2 instance to configure cron jobs and deploy Talend workflows.
  • Created Hive external tables, loaded data into the tables and queried the data using HQL (a brief example follows this list).
  • Developed ETL data pipelines using Spark, Spark streaming and Scala.
  • Wrote and ran SQL, BI and other reports; analyzed data; and created metrics, dashboards, pivots, etc.
  • Gathered and analyzed business data requirements and modeled those needs, working closely with the users of the information, the application developers and the architects to ensure the information models are capable of meeting their needs.
  • Worked with the ETL team to document transformation rules for data migration from OLTP to the warehouse for reporting purposes.
  • Transformed Logical Data Model to Physical Data Model ensuring the Primary Key and Foreign key relationships in PDM, Consistency of definitions of Data Attributes and Primary Index considerations.
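
For illustration, a minimal sketch of a Hive external table and HQL query of the kind referenced above; the table, columns and HDFS location are hypothetical:

    -- External table over delimited files already landed in HDFS
    CREATE EXTERNAL TABLE IF NOT EXISTS sales_raw (
        order_id    BIGINT,
        customer_id STRING,
        order_date  STRING,
        amount      DOUBLE
    )
    ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION '/data/landing/sales_raw';

    -- Sample HQL aggregation against the external table
    SELECT customer_id, SUM(amount) AS total_amount
    FROM sales_raw
    WHERE order_date >= '2016-01-01'
    GROUP BY customer_id;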

Environment: Python, MySQL, PostgreSQL, SQL Server, Erwin, Informatica, AWS Redshift, RDS, Big Data, JDBC, NoSQL, Spark, Scala, Star Schema, Snowflake Schema.

Confidential, Bronx, NY

Sr. Data Analyst/Modeler

Responsibilities:

  • Developed the logical and physical data models that capture current-state/future-state data elements and data flows using ER Studio.
  • Reverse Engineered DB2 databases and then forward engineered them to Teradata using ER Studio.
  • Part of team conducting logical data analysis and data modeling JAD sessions, communicated data-related standards .
  • Performed statistical analysis and data management on study data by utilizing appropriate statistical methods using SAS and SAS tools.
  • Involved in meetings with SME (subject matter experts) for analyzing the multiple sources.
  • Involved in writing SQL queries and optimizing the queries in Teradata.
  • Created DDL scripts using ER Studio and source to target mappings to bring the data from source to the warehouse.
  • Expedited the ETL process of health insurance payer service fund data into SQL Server databases through SPSS Modeler stream creation. In addition, accelerated reformatting of ETL schema to account for payer modifications of data file formats.
  • Developed the design & Process flow to ensure that the process is repeatable.
  • Performed analysis of the existing source systems (Transaction database)
  • Involved in maintaining and updating Metadata Repository with details on the nature and use of applications/data transformations to facilitate impact analysis.
  • Designed the ER diagrams, logical model (relationship, cardinality, attributes, and, candidate keys) and physical database (capacity planning, object creation and aggregation strategies) for Oracle and Teradata .
  • Worked on importing and cleansing of data from various sources like Teradata, Oracle, flat files and MS SQL Server with high-volume data.
  • Designed Logical & Physical Data Model /Metadata/ data dictionary using Erwin for both OLTP and OLAP based systems.
  • Directed the development, testing and maintenance of SAS EBI reports.
  • Identified, assessed and communicated potential risks associated with testing scope, quality of the product and schedule.
  • Wrote and executed SQL queries to verify that data had been moved from the transactional system to the DSS, data warehouse and data mart reporting systems in accordance with requirements (a sample verification query follows this list).
  • Worked on importing and cleansing of data from various sources like Teradata, Oracle, flat files and SQL Server 2005 with high-volume data.
  • Worked extensively on ER Studio for multiple Operations across Atlas Copco in both OLAP and OLTP applications.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
  • Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data.
  • Coordinated all teams to centralize metadata management updates and follow the standard naming and attribute standards for data and ETL jobs.
  • Finalized the naming standards for data elements and ETL jobs and created a data dictionary for metadata management.
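
For illustration, a minimal sketch of the kind of verification query mentioned above, comparing a transactional source with its warehouse target; the schema and table names are hypothetical:

    -- Compare row counts between the source system and the warehouse target
    SELECT 'source' AS layer, COUNT(*) AS row_count
    FROM src_oltp.customer_orders
    UNION ALL
    SELECT 'warehouse' AS layer, COUNT(*) AS row_count
    FROM dw.fact_customer_orders;

    -- Spot-check an aggregate measure in addition to the row counts
    SELECT SUM(order_amount) AS source_total    FROM src_oltp.customer_orders;
    SELECT SUM(order_amount) AS warehouse_total FROM dw.fact_customer_orders;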

Environment: ER Studio, Business Objects XI, Rational Rose, DataStage, MS Office, MS Visio, SQL, SQL Server 2000/2005/2008, Crystal Reports 9, SQL Server Analysis Services, SSIS, Oracle 10g.

Confidential, Minnetonka, MN

Sr. Data Analyst/Modeler

Responsibilities:

  • Designed Logical Data Models and Physical Data Models using Erwin.
  • Developed the conceptual and logical data models and transformed them into schemas using Erwin.
  • Analyzed the business requirements by dividing them into subject areas and understood the data flow within the organization.
  • Generated a separate MRM document with each assignment and shared it on SharePoint along with a PDF of the updated data models.
  • Developed a data mart for the base data in star schema and snowflake schema; involved in developing the data warehouse for the database (a sample star-schema DDL follows this list).
  • Created a list of domains in Erwin and worked on building up the data dictionary for the company.
  • Created DDL scripts for implementing data modeling changes; created Erwin reports in HTML and RTF formats depending on the requirement; published data models in the model mart; created naming convention files; and coordinated with DBAs to apply the data model changes.
  • Analyzed the physical data model to understand the relationships between existing tables; cleansed the unwanted tables and columns per the requirements as part of the data analyst role.
  • Worked very closely with the data architecture and DBA teams to implement data model changes in the database in all environments.
  • Create conceptual data models including categories, subject areas, key entities and entity relationships conforming to the applicable data standards within the scope of the business requirements and leveraging prior health care payer experience
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable
  • Worked on Unit Testing for three reports and created SQL Test Scripts for each report as required
  • Extensively used Erwin as the main tool for modeling along with Visio
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Worked on the Metadata Repository (MRM), keeping definitions and mapping rules up to date.
  • Worked on data mapping process from source system to target system. Created dimensional model for the reporting system by identifying required facts and dimensions using Erwin
  • Developed enhancements to the MongoDB architecture to improve performance and scalability.
  • Performed forward engineering of data models, reverse engineering of existing data models, and updates to the data models.
  • Performed data cleaning and data manipulation activities using NZSQL utility.
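
For illustration, a minimal star-schema DDL sketch of the kind described above; the fact and dimension tables and their columns are hypothetical:

    -- Hypothetical dimension tables
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_id   VARCHAR(32),
        customer_name VARCHAR(128),
        region        VARCHAR(64)
    );

    CREATE TABLE dim_date (
        date_key       INTEGER PRIMARY KEY,
        calendar_date  DATE,
        fiscal_quarter VARCHAR(8)
    );

    -- Fact table keyed by surrogate keys to the dimensions
    CREATE TABLE fact_sales (
        sales_key    INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer (customer_key),
        date_key     INTEGER REFERENCES dim_date (date_key),
        quantity     INTEGER,
        sales_amount DECIMAL(12,2)
    );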

Environment: Oracle Data Modeler, Teradata 12, SSIS, Business Objects, Erwin r8.2, Oracle SQL Developer, SQL Server 2008, ER/Studio, Windows XP, MS Excel.
