
Sr. Data Analyst Resume


Philadelphia, PA

SUMMARY

  • 8+ years of Professional Experience in different environments and platforms, including Data Analysis, Data Warehousing, Data Modeling, Data Mapping, Data Lineage, Data Profiling, Informatica Data Quality (IDQ), Informatica Data Analyst (IDA), Informatica PowerCenter and Client-Server applications in the Pharmaceutical Industry.
  • Strong IT experience in Data Analysis, ETL Development, Data Modeling, Data Validation, Predictive Modeling, Data Visualization and Web Scraping. Adept in statistical programming languages like R and Python, as well as Big Data technologies like Hadoop and Hive.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Ab Initio, Informatica PowerCenter, Informatica Data Quality (IDQ), Informatica Analyst, Informatica Master Data Management (MDM), Informatica Metadata Manager, DataStage and SSIS.
  • Extensively used Informatica PowerCenter and Informatica Data Quality (IDQ) as ETL tools for extracting, transforming, loading and cleansing data from various source inputs to various targets, in batch and real time.
  • Extensive hands-on experience with CRO/vendor oversight of pharmacovigilance deliverables and databases.
  • Experienced in Installation, Configuration, and Administration of Informatica Data Quality and Informatica Data Analyst.
  • Extensive experience in generating data visualizations using R, Python and creating dashboards using tools like Tableau.
  • Expert in maintaining and managing Microsoft Power BI and Tableau reports and dashboards and publishing them to end users to support executive-level business decisions.
  • Good knowledge of SAS Macros, SAS SQL, SAS/STAT and SAS/GRAPH in a UNIX environment.
  • Extensive experience with OLTP/OLAP systems and E-R modeling, developing database schemas like Star schema and Snowflake schema used in relational, dimensional and multidimensional modeling.
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.
  • Expertise with Informatica Data Quality (IDQ) toolkit, data cleansing, data matching, data conversion, data profiling, exception management, reporting and monitoring functions.
  • Excellent experience in data Extraction, Transformation and Loading (ETL) using tools such as SQL Server Integration Services (SSIS) and SQL Replication.
  • Proficient in developing parameterized reports, cascading reports, drill down reports, drill through reports, sub reports and ad-hoc reports in SSRS by pulling data from SSAS or relational database.
  • Extensive experience in Strategic development of a Data Warehouse and in Performing Data Analysis and Data Mapping from an Operational Data Store to an Enterprise Data Warehouse.
  • Good understanding of SSRS report authoring, report management, report delivery and report security; created reports with an Analysis Services cube as the data source using SSRS.
  • Expertise in Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Hands-on experience with Power BI and Power Pivot, including SharePoint integration, and in creating dashboards in Power BI and Tableau visualization tools.
  • Good knowledge of the ETL (Extract, Transform and Load) of data into a data warehouse/data mart and Business Intelligence (BI) tools like Business Objects modules (Reporter, Supervisor, Designer and Web Intelligence).
  • Expertise in Installation and Configuration of Informatica MDM Hub, Cleanse and Match Server, Informatica Power Center.
  • Experienced in full life cycle MDM development including data analysis, database design, data mapping and data load in batch.
  • Good Knowledge in Defining the Match and Merge Rules in the MDM Hub by creating Path components, Columns and rules.
  • Knowledge in Business Intelligence tools like Business Objects, Cognos and OBIEE.
  • Strong proficiency in SQL databases, trading systems and accounting applications, Informatica ETL, FactSet, Morningstar, Murex, Kofax and FileNet.
  • Experience in writing SQL queries and optimizing the queries in Sybase, Oracle and SQL Server.
  • Experience with Teradata as the target for data marts; worked with BTEQ, FastLoad and MultiLoad.
  • Good knowledge of Teradata Manager, TDWM, SQL Assistant and BTEQ.
  • Extensively worked with Teradata utilities like BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems including flat files.
  • Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints and SQL Trace in both Teradata and Oracle (a brief sketch follows this summary).
  • Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.
  • Created Informatica mappings and workflows for loading data from different sources to Fin Master.
  • Expert in T-SQL DDL/DML; performed most SQL Server Enterprise Manager and Management Studio functionality using T-SQL scripts and batches.
  • Experience in working with Data Management and Data Governance based assignments.
  • Have extensive knowledge in Data flow modeling and Object modeling, case analysis and functional decomposition analysis.
  • Deep understanding of and hands-on experience with data subsetting, profiling and cloning.
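
As mentioned in the tuning bullet above, the following is a minimal, hypothetical sketch of the EXPLAIN / COLLECT STATISTICS workflow in Teradata and Oracle; the table and column names (sales_fact, customer_dim, customer_sk, sale_amt) are illustrative placeholders only.

    -- Teradata: inspect the optimizer plan for a join before tuning it
    EXPLAIN
    SELECT c.customer_id, SUM(f.sale_amt) AS total_sales
    FROM   sales_fact f
    JOIN   customer_dim c ON c.customer_sk = f.customer_sk
    GROUP  BY c.customer_id;

    -- Teradata: refresh statistics so the optimizer costs the join correctly
    COLLECT STATISTICS ON sales_fact COLUMN (customer_sk);

    -- Oracle: the equivalent plan check, with an optional hash-join hint
    EXPLAIN PLAN FOR
    SELECT /*+ USE_HASH(f c) */ c.customer_id, SUM(f.sale_amt) AS total_sales
    FROM   sales_fact f
    JOIN   customer_dim c ON c.customer_sk = f.customer_sk
    GROUP  BY c.customer_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);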

PROFESSIONAL EXPERIENCE

Confidential, Philadelphia, PA

Sr. Data Analyst

Responsibilities:

  • Performed Data Analysis, primarily identifying data sets, source data, source metadata, data definitions, data formats, data validation, data cleansing, data verification and data mismatches.
  • Created various profiles using Informatica Data Explorer (IDE) & Informatica Data Quality (IDQ), from existing sources and shared those profiles with business analysts for their analysis on defining business strategies in terms of activities such as assigning matching scores for different criteria etc.
  • Created dashboards as part of Data Visualization using Tableau and Power BI.
  • Worked on data that was a combination of unstructured and structured data from multiple sources and automated the cleaning using SQL scripts.
  • Migrated all DTS packages to SQL Server Integration Services (SSIS) and modified the packages accordingly using the advanced features of SQL Server Integration Services.
  • Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographical maps and Gantt charts via the Show Me functionality.
  • Designed a STAR schema for the detailed data marts and Plan data marts involving conformed dimensions.
  • Performed Count Validation, Dimensional Analysis, Statistical Analysis and Data Quality Validation in Data Migration.
  • Created reports from data warehouse using SSRS.
  • Designed and implemented multiple dashboards using Power BI - Power Pivot & Power Query tools for in house metrics.
  • Used SQL and SAS skills to perform ETL from DB2 and Teradata databases and created SAS datasets, SAS macros, Proc/data steps and SAS formats as required.
  • Migrated data from different sources (text-based files, Excel spreadsheets, and Access) to SQL Server databases using SQL Server Integration Services (SSIS).
  • Wrote programs and SAS macros to assist other team members in resolving issues during conversion, and explained their use and functionality to the team.
  • Participated in all phases of data mining, data collection, data cleaning, developing models, validation and visualization.
  • Designed and developed the MDM data model and its integration with WebSphere Process Server.
  • Created Page Level Filters, Report Level Filters, and Visual Level Filters in Power BI according to the requirements.
  • Involved in the implementation of the MDM concept with upstream and downstream integration of different applications in an agile environment.
  • Created tabular and matrix reports, charts and graphs per customer requirements using SSRS.
  • Analyzed the pharmaceutical data and business terms from a data quality and integrity perspective.
  • Involved in implementing an MDM process to take strategy to roadmap and design development activities; delivered the MDM roadmap.
  • Delivered an Enterprise Data Governance, Data Analyst, Metadata and ETL Informatica solution.
  • Assisted the ETL team in defining Source to Target Mappings.
  • Proven track record in troubleshooting of ETL mappings and addressing production issues like performance tuning and enhancement.
  • Performed root cause analysis on smaller self-contained data analysis tasks related to assigned data processes.
  • Created a road map for the client for the planning, developing, and implementing of MDM solutions, enabling consolidation of MDM data following Mergers and Acquisitions.
  • Performed daily data quality checks and data profiling for accurate and better reporting and analysis.
  • Involved in understanding customer needs with regard to data, documenting requirements, writing complex SQL statements to extract the data, and packaging data for delivery to customers.
  • Wrote complex SQL and PL/SQL queries to identify granularity issues and relationships between data sets and recommended solutions based on analysis of the query results (a minimal sketch follows this list).
  • Performed unit testing on transformation rules to ensure data moved correctly.
  • Wrote various SQL statements to generate data for analysis purposes.
  • Wrote the SQL and PL/SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Maintained Excel workbooks, such as development of pivot tables, exporting data from external SQL databases, producing reports and updating spreadsheet information.
  • Researched and fixed data issues pointed out by QA team during regression tests.
  • Wrote SQL stored procedures and views and performed in-depth testing of new and existing systems.
  • Manipulated and prepared data and extracted data from databases for business analysts using Tableau.
  • Reviewed normalized schemas for effective and optimal performance, tuning queries and data validations in OLTP and OLAP environments.
  • Exploited the power of MS SQL to solve complex business problems through data analysis on large data sets.
  • Working knowledge of different data sources such as flat files and RDBMSs including Oracle, SQL Server and Teradata, with experience in data extraction, profiling and identifying data quality issues.
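
A minimal sketch of the granularity and count-validation queries referenced above; the staging and warehouse table names (stg_claims, dw_claims) and the key column (claim_id) are hypothetical placeholders.

    -- Grain check: a natural key appearing more than once breaks the expected grain
    SELECT claim_id, COUNT(*) AS row_cnt
    FROM   stg_claims
    GROUP  BY claim_id
    HAVING COUNT(*) > 1;

    -- Count validation: compare staged rows with loaded rows per business date
    SELECT s.load_date,
           s.src_rows,
           t.tgt_rows,
           s.src_rows - t.tgt_rows AS row_diff
    FROM  (SELECT load_date, COUNT(*) AS src_rows FROM stg_claims GROUP BY load_date) s
    LEFT JOIN
          (SELECT load_date, COUNT(*) AS tgt_rows FROM dw_claims GROUP BY load_date) t
      ON   s.load_date = t.load_date
    ORDER  BY s.load_date;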

Confidential, Dublin, OH

Data Analyst

Responsibilities:

  • Created SSIS packages using various transformations such as Lookup, Derived Column, Conditional Split, Data Conversion, Aggregate, Merge Join, Sort, Execute SQL Task, Data Flow Task and Execute Package Task to generate underlying data for the reports and to export cleansed data from Excel spreadsheets, text files and MS Access to the data warehouse.
  • Extensively used Informatica Data Explorer (IDE) & Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate score cards, create and validate rules and provided data for business analysts for creating the rules.
  • Created ad-hoc reports for users in Tableau by connecting to various data sources.
  • Created/Reviewed data flow diagram to illustrate where data originates and how data flows within the Enterprise Data warehouse (EDW).
  • Created SSIS packages to load data from flat files to SQL Server using Lookup, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, Aggregate, Pivot and Slowly Changing Dimension transformations.
  • Created different visualization (Stacked bar Chart, Clustered bar Chart, Scatter Chart, Pie Chart, Donut Chart, Line & Clustered Column Chart, Map, Slicer, Time Brush etc.) in Power BI according to the requirements.
  • Developed dashboards and ad-hoc reports using MS Power BI and SSRS for senior management team for analysis.
  • Involved in developing SAS macros for data cleaning, reporting and supporting routine processing.
  • Used MDM tool to support Master Data Management by removing duplicates, standardizing data (Mass Maintaining), and incorporating rules to eliminate incorrect data and filter data as per requirements.
  • Provided continued maintenance and development of bug fixes for the existing and new Power BI Reports.
  • Developed the required data warehouse model using Star schema for the generalized model.
  • Worked closely with the Enterprise Data Warehouse team and Business Intelligence Architecture team to understand repository objects that support the business requirement and process.
  • In charge of comprehensive Data Quality by making sure invalid, inconsistent or missing Obligor Risk Ratings are reported to portfolio managers for remediation and ensure that checks are in place to prevent the issue from re-occurring.
  • Managed functional requirements for interfaces and conversions between other legacy systems to Teradata, MDM, Enhancements, Workflows and Reports for MDM.
  • Documented logical, physical, relational and dimensional data models. Designed the Data Marts in dimensional data modeling using star and snowflake schemas.
  • Reviewed database tables and performed data mapping for a large-scale data conversion project from an Oracle database to the mainframe.
  • Wrote SQL queries for each test case and executed them in SQL*Plus to validate the data between the Enterprise Data Warehouse and Data Mart staging tables (a minimal comparison sketch follows this list).
  • Created Data Flow Diagrams and Process Flow Diagrams for various load components like FTP Load, SQL Loader Load, ETL process and various other processes that required transformation.
  • Validated the test data in DB2 tables on Mainframes and on Teradata using SQL queries.
  • Transformed project data requirements into project data models.
  • Wrote Test cases for Enterprise Data Warehousing (EDW) Tables and Data Mart Staging Tables.
  • Worked on performance tuning and data loading for fast report access on the client/database server, load balancing, business rules implementation, metadata and data profiling.
  • Incorporated functional decomposition and logical models into an enterprise data model.
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Extracted test data from tables and loaded the data into SQL tables.
  • Used SAS ODS to create output files in different formats including PDF and RTF.
  • Collected business requirements to set rules for proper data transfer from Data Source to Data Target in Data Mapping.
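
A minimal sketch of the kind of SQL*Plus comparison query mentioned above, using MINUS to validate EDW tables against Data Mart staging tables; the schema and table names (edw.customer, stg.customer_stg) are hypothetical.

    -- Rows present in the EDW table but missing or different in staging
    SELECT customer_id, customer_name, status_cd
    FROM   edw.customer
    MINUS
    SELECT customer_id, customer_name, status_cd
    FROM   stg.customer_stg;

    -- The reverse direction catches extra or changed rows in staging
    SELECT customer_id, customer_name, status_cd
    FROM   stg.customer_stg
    MINUS
    SELECT customer_id, customer_name, status_cd
    FROM   edw.customer;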

Confidential, Las Vegas, NV

Data Analyst

Responsibilities:

  • Created various data quality mappings in the Informatica Data Quality (IDQ) tool and imported them into Informatica PowerCenter as mappings and mapplets.
  • Extensively used Informatica Data Quality transformations - Labeler, Parser, Standardizer, Match, Association, Consolidation, Merge, Address Validator, Case Converter, and Classifier.
  • Evaluated data models/reports in the ETL process for quality and accuracy and made well-reasoned recommendations, backed up by strong data visualizations, to support the business team.
  • Used SSIS to create ETL packages to validate, extract, transform and load data to data warehouse databases, data mart databases.
  • Worked extensively in documenting the Source to Target Mapping documents with data transformation logic.
  • Transformed requirements into data structures that can be used to efficiently store, manipulate and retrieve information.
  • Created new reports using SQL Server Reporting Services (SSRS).
  • Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
  • Evaluated and selected various MDM technologies and products in view of the client requirements.
  • Used Teradata utilities FastLoad and MultiLoad to load data.
  • Wrote, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.
  • Involved in migration projects to migrate data from data warehouses on Oracle/DB2 to Teradata.
  • Define downstream targets for MDM with design for attributes and sync methodologies.
  • Created complex Teradata scripts to generate ad-hoc reports that supported day-to-day monitoring.
  • Automated and scheduled recurring reporting processes using UNIX shell scripting and Teradata utilities such as MultiLoad, BTEQ and FastLoad (a minimal report-query sketch follows this list).
  • Involved in building a specific data mart as part of a Business Objects Universe, which replaced the existing system of reporting based on exporting data sets from Teradata to Excel spreadsheets.
  • Migrated three critical reporting systems to Business Objects and Web Intelligence on a Teradata platform.
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
  • Used data analysis techniques to validate business rules and identify low-quality or missing data in the existing pharmaceutical Enterprise Data Warehouse (EDW).
  • Also assessed the impact of low-quality and/or missing data on the performance of the data warehouse client.
  • Identified design flaws in the data warehouse.
  • Tested raw data and executed performance scripts.
  • Performed Data Analysis and Data validation by writing complex SQL queries using TOAD against the ORACLE database.
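
A minimal sketch of the kind of ad-hoc Teradata report query that the BTEQ jobs above would run (the BTEQ session commands such as .LOGON and .EXPORT are omitted); the table and column names (orders_fact, region_dim, order_amt) are hypothetical.

    -- Trailing-week summary by region, of the type scheduled via shell scripting and BTEQ
    SELECT   r.region_name,
             o.order_dt,
             COUNT(*)          AS order_cnt,
             SUM(o.order_amt)  AS total_amt
    FROM     orders_fact o
    JOIN     region_dim  r ON r.region_sk = o.region_sk
    WHERE    o.order_dt >= CURRENT_DATE - 7
    GROUP BY r.region_name, o.order_dt
    ORDER BY o.order_dt, r.region_name;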

Confidential, Dallas, TX

Data Analyst

Responsibilities:

  • Created and maintained data model/architecture standards, including master data management (MDM).
  • Extensively tested the ETL mappings which were developed to load data from Oracle and SQL Server sources into the Oracle Staging/Target Area.
  • Verified and validated data model structure and E-R modeling with all related entities and relationship with each entity based on the rules using Erwin as per the business requirements.
  • Worked on development, implementation, administration and support of ETL processes for large-scale Data Warehouses using SSIS and DTS.
  • Worked in the implementation of loan and risk management system applications.
  • Used MDM tool to support Master Data Management by removing duplicates, standardizing data (Mass Maintaining), and incorporating rules to eliminate incorrect data and filter data as per requirements.
  • Used SQL queries to verify the data from the Oracle and MS SQL Server databases.
  • Performed backend testing using SQL queries and analyzed the server performance on UNIX.
  • Tested mappings and SQL queries in transformations such as Expression transformation, Filter transformation, Lookup transformation, Joiner transformation, XML transformation, Aggregator transformation, and Update strategy transformation.
  • Created T-SQL scripts for inclusion in the development of SSRS and Crystal Reports (a minimal sketch follows this list).
  • Involved in migration projects to migrate data from data warehouses on Oracle/DB2 to Teradata.
  • Responsible for managing the data, duplicating it and storing it into the specific data warehouse using the Talend Platform for Data Management and MDM.
  • Worked extensively in documenting the Source to Target Mapping documents with data transformation logic.
  • Extensively used Informatica Debugger to validate maps and to gain troubleshooting information about data and error conditions.
  • Interacted with client user personnel to ensure continuing adherence to requirements and user standards.
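
A minimal T-SQL sketch of the kind of view and parameterized procedure that could feed an SSRS or Crystal Reports dataset, as referenced above; the object and column names (dbo.Loan, RiskRating, OutstandingAmt) are hypothetical.

    -- View summarizing loan exposure by branch and risk rating (hypothetical schema)
    CREATE VIEW dbo.vw_LoanRiskSummary
    AS
    SELECT  l.BranchId,
            l.RiskRating,
            COUNT(*)              AS LoanCount,
            SUM(l.OutstandingAmt) AS TotalExposure
    FROM    dbo.Loan AS l
    GROUP BY l.BranchId, l.RiskRating;
    GO

    -- Parameterized procedure so the report can filter by branch
    CREATE PROCEDURE dbo.usp_GetLoanRiskSummary
        @BranchId INT
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT BranchId, RiskRating, LoanCount, TotalExposure
        FROM   dbo.vw_LoanRiskSummary
        WHERE  BranchId = @BranchId;
    END;
    GO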

Confidential, Fort Lauderdale, FL

Data Analyst

Responsibilities:

  • Created dashboards and visualizations of different types for analysis, monitoring, management and better understanding of business performance metrics.
  • Involved in developing and modifying ETL and MDM using Informatica Developer.
  • Involved in Master Data Management (MDM) to help the organization with strategic decision making.
  • Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
  • Involved in creating dashboards, scorecards, views, pivot tables and charts for further data analysis.
  • Defined and communicated project scope, milestones/deliverables and projected requirements from clients.
  • Used Tableau on SQL-queried data for data analysis, generating reports, graphics and statistical analyses.
  • Raised questions up front to determine key issues and foresee potential show-stoppers that might arise later; developed an in-depth understanding of OTC derivatives operations.
  • Involved in developing reports using advanced SQL techniques like RANK and ROW_NUMBER (a minimal sketch follows this list).
  • Analyzed data using Tableau for automation and determined business data trends.
  • Involved in providing guidance for transitioning from Access to SQL Server.
  • Transferred data objects and queries from MS Excel to SQL Server.
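
A minimal sketch of the RANK / ROW_NUMBER style referenced above, returning the top products per region; the table and column names (product_sales, sales_amt) are hypothetical.

    SELECT *
    FROM (
        SELECT  product_id,
                region,
                sales_amt,
                RANK()       OVER (PARTITION BY region ORDER BY sales_amt DESC) AS sales_rank,
                ROW_NUMBER() OVER (PARTITION BY region ORDER BY sales_amt DESC) AS row_num
        FROM    product_sales
    ) AS ranked
    WHERE row_num <= 5          -- top 5 products per region
    ORDER BY region, row_num;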
