Sr. Data Analyst Resume

Franklin Lakes, NJ

SUMMARY

  • Overall 7+ years of experience as a Data Analyst in the pharmaceutical industry.
  • Solid understanding of Business Requirements Gathering, Business Intelligence, Data Warehousing, Data Mapping, Data Validation, Data Integrity, Master Data Management (MDM), Reporting, Data Governance, Data Stewardship, and Data Modeling.
  • Extensive experience in conducting Market Research, Feasibility Studies, Data Analysis, Data Mapping, Data Profiling, Gap Analysis, Risk Identification, Risk Assessment, Risk Analysis, and Risk Management.
  • Experience with the Informatica ETL tool (PowerCenter Designer, Repository Manager, Workflow Manager, Workflow Monitor).
  • Extensively involved in MDM to support strategic decision making and process improvements, streamlining data sharing among personnel and departments.
  • Assisted in Clinical Data Management and Clinical Trials Management.
  • Experience as a Data Analyst in the Pharmaceutical and Healthcare domains, and as a passionate business, process, and qualitative analyst with expertise in understanding business problems.
  • Comprehensive knowledge of software development methodologies such as RAD, RUP and AGILE SCRUM.
  • Experience in data extraction (ETL), MIS reporting, data conversion and manipulation, transformation, data cleansing and suppression, data validation and error trapping, testing, and loading large data sets into SAS from various applications/databases (Oracle, Teradata, DB2, SQL Server, Access, spreadsheets, text files) and environments (Windows, UNIX, TSO/MVS mainframe).
  • Experienced in designing complex reports utilizing SSRS.
  • Worked heavily on clinical trials using IVR to conduct global studies and manage the large volumes of data generated.
  • Experience in Data Lineage using Excel and Collibra.
  • In-depth knowledge of Rational Unified Process (RUP) methodology, Use Cases, Software Development Life Cycle (SDLC) processes, and Object-Oriented Analysis and Design (OOA/D).
  • Experienced in defining the queries for the reports using SSRS.
  • Experience utilizing SQL Server Reporting Services (SSRS), Cognos Impromptu, and Microsoft Excel.
  • Extensive experience in Data Mart systems, BPM, and SQL queries.
  • Experience with the System Life Cycle (SLC) and the Medidata RAVE application.
  • Demonstrated ability to adapt to new environments, learn new skills, and participate in and lead teams.
  • Acquainted with SQL, Quality Center, QuickTest Pro, WinRunner, and LoadRunner.
  • Expertise in broad range of technologies, including business process tools such as Microsoft Project, Primavera, Promodel, MS Excel, MS Access, MS Visio, technical assessment tools, Data Warehousing concepts and web design and development.
  • Highly motivated team player with excellent Interpersonal and Customer Relational Skills, Proven Communication, Organizational, Analytical, Presentation Skills, and Leadership Qualities.

PROFESSIONAL EXPERIENCE

Confidential - Franklin Lakes, NJ

Sr. Data Analyst

Responsibilities:

  • Assessed the current system documentation for the Clinical Research Collaborations (CRC) system and provided recommendations for consolidating and updating the existing documentation.
  • Implemented the project using the Agile Methodology to produce artifacts in the different phases of the Software Development Life Cycle (SDLC).
  • Participated effectively in clinical research activities including design, consent, data gathering, and data retrieval.
  • Interacted regularly with the core developers, which helped fix defects in less time.
  • Created SAS programs used for data validation, statistical report generation, and program validation, and automated the Edit Check programs using macros.
  • Participated in continuous improvement projects for a Medical and Pharmaceutical Delivery Contract Manufacturing Organization.
  • Created logical and physical data models with star and snowflake schema techniques using Erwin in the data warehouse as well as in the data mart.
  • Reviewed stored procedures for reports and wrote test queries against the source system (SQL Server/SSRS) to match the results of the actual report against the data mart (Oracle), as sketched after this list.
  • Performed data analysis on source data to be transferred into an MDM structure.
  • Involved in Writing and reviewing Installation Qualification (IQ), Operational Qualification (OQ), Performance Qualification (PQ) protocols for the LIMS application.
  • Participated in the analysis of the User Requirements and Functional Requirements for Agile PLM and LIMS.
  • Validated data integration between the interface and LIMS as per the generated IQ, OQ, and PQ documents.
  • Worked on developing several detail and summary reports including line and pie charts, trend analysis reports and sub-reports according to business requirements using SQL Server Reporting Services (SSRS).
  • Researched data issues and identified the root causes in source systems, the data warehouse, and data marts using various data analysis techniques, data profiling, and data mining activities.
  • Used Master Data Management (MDM) technology and tools to create and maintain consistent and accurate lists of master data.
  • Involved in MDM Process including data modeling, ETL process, and prepared data mapping documents based on graph requirements.
  • Worked on data warehousing projects dealing with ETL (extraction, transformation, loading) using various databases.
  • Evaluated all SQL applications, prepared layouts for all logical models, and maintained database objects for various application components.
  • Performed walkthroughs for Conceptual Data Model, Logical Data Model and Physical Data Model for validating designs for the enterprise data warehouse.
  • Involved in writing SQL queries to extract related data from transaction record database for OLAP purposes.
  • Responsible for managing all data management activities, such as requirements gathering, specification document design, eCRF design, study set-up, edit checks, database and user acceptance testing, view deployment, and the quality checks required to support the clinical trial process and handle application issues.
  • Involved in designing complex reports utilizing SSRS.
  • Worked on Business Process Management (BPM), Use Case modeling using UML, and Data Modeling.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Involved in designing and developing a data management application with a central operational database.
  • Worked on identifying an efficient design flow for migrating from SQL Server to the Oracle Exadata platform.
  • Conducted data driven testing using QTP to conduct backend testing.
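
A minimal sketch of the report-reconciliation test query mentioned above. The schema, table, and column names (dbo.rpt_claim_summary, dm.claim_fact, region_cd, claim_amt) are hypothetical; the assumption is that the same aggregate is run once against the SQL Server source behind the SSRS report and once against the Oracle data mart, and the two result sets are then compared.

    -- Hypothetical aggregate run against the SQL Server source feeding the SSRS report
    SELECT region_cd,
           COUNT(*)       AS row_cnt,
           SUM(claim_amt) AS total_amt
    FROM   dbo.rpt_claim_summary
    WHERE  report_month = '2016-06-01'
    GROUP  BY region_cd;

    -- The same aggregate run against the Oracle data mart; the two result sets
    -- are compared row by row to confirm the report matches the mart
    SELECT region_cd,
           COUNT(*)       AS row_cnt,
           SUM(claim_amt) AS total_amt
    FROM   dm.claim_fact
    WHERE  report_month = DATE '2016-06-01'
    GROUP  BY region_cd;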

Confidential, Groton, CT

Data Analyst

Responsibilities:

  • Assured that all validation documentation, such as Process Validation and Software Validation, followed GMPs.
  • Worked on data mapping and logical data modeling; used SQL queries to filter data within the Oracle database tables.
  • Involved in exhaustive documentation for technical phase of the project and training materials for all data management functions.
  • Involved in designing SAS Visual Analytics (VA) reports using VA Data builder, VA Explorer, VA Designer, VA Viewer and creating custom reports in SAS VA.
  • Worked on Metadata Management, Data Analysis, Data Integrity, Data Acquisition, Data Stewardship, Data Mining, Data Profiling & Quality, Data Governance, and Master Data Management (MDM).
  • Performed physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, FACT, and Dimensions), Entities, Attributes, OLAP, OLTP, Cardinality, and ER Diagrams.
  • Worked closely with the reporting team to deploy Tableau reports and publish them on the Tableau and SharePoint servers.
  • Worked on generating a data element matrix encompassing all data elements, report by report, including the data mapping for the data warehouse.
  • Worked as ETL Tester responsible for the requirements / ETL Analysis, ETL Testing and designing of the flow and the logic for the Data warehouse project.
  • Manipulated the data and modified code wherever necessary using SAS functions, PROC SQL statements, formats, and informats in SAS Enterprise Guide.
  • Involved in working on Salesforce, Veeva, Master Data Management (MDM), Customer Data Integration, Roster database upgrade and other significant Siebel related projects.
  • Worked on implementing Collibra to automate data management processes.
  • Involved in utilizing SQL Server Reporting Services (SSRS), Cognos Impromptu, and Microsoft Excel.
  • Identified the database tables for defining the queries for the reports using SSRS.
  • Successfully conducted JAD sessions, which helped synchronize the different stakeholders on their objectives and helped the developers to have a clear-cut picture of the project.
  • Wrote complex SQL and PL/SQL testing scripts for backend testing of the data warehouse application; a representative check is sketched after this list.
  • Involved in writing test cases, test scripts, test plans and execution of test cases reporting and documenting the test results using Mercury Quality Center.
  • Involved in implementing Master Data Management (MDM) for all entities in the project by maintaining a master hub for improving the data quality by following various data governance techniques and data dictionary standards.
  • Worked on researching, analyzing, reconciling balances, process mapping, developing policies and procedures to improve tracking and issuing of Watershed material, preparing MS Excel, and MS Word Reports for Senior Management.
  • Responsible for managing the process of recycling and disposing of used and obsolete material.
  • Interacted with users for verifying User Requirements, managing Change Control Process, updating existing Documentation.
  • Verified and maintained data quality, integrity, and completeness, as well as ETL rules and business logic.
  • Tested several UNIX Korn Shell scripts for ETL data loading.
  • Worked on Documentum for version control, to maintain up-to-date changes in the documents.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
  • Used Data Loader for insert, update, upsert, and bulk import or export of data from Salesforce Objects. Used it to read, extract and load data from comma separated value (CSV) files.
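
A minimal sketch of the kind of backend-testing check referenced above. The warehouse tables (dw.visit_fact, dw.patient_dim) and keys are hypothetical; the query assumes a star-schema warehouse and looks for fact rows whose dimension key has no match, which should return zero rows if the load is consistent.

    -- Hypothetical referential-integrity check on the warehouse:
    -- fact rows whose patient key has no matching row in the patient dimension
    SELECT f.patient_key,
           COUNT(*) AS orphan_rows
    FROM   dw.visit_fact f
           LEFT JOIN dw.patient_dim d
                  ON d.patient_key = f.patient_key
    WHERE  d.patient_key IS NULL
    GROUP  BY f.patient_key;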

Confidential - Little Rock, AR

Data Analyst

Responsibilities:

  • Involved in working with clinical trial data such as demographics, AE, SAE, labs, vital signs, and CRF/eCRF, as well as other essential documents in clinical research.
  • Worked on Clinical Data Analysis in accordance with guidelines and providing Clinical Study Reports.
  • Involved in designing the Star Schema, creating Fact tables and Dimension tables, and defining the relationships between them; a representative query against such a schema is sketched after this list.
  • Identified source systems, connectivity, tables, and fields, and ensured data suitability for mapping.
  • Responsible for reviewing the data model, database physical design, ETL design, and presentation layer design.
  • Worked with end users on BI data analysis and data validation; also responsible for Master Data Management (MDM) and the Data Life Cycle process.
  • Involved in understanding customer needs with regard to data, documenting requirements, developing complex SQL statements to extract the data, and packaging/encrypting data for delivery to customers.
  • Worked on FACETS batches such as XPF and MMS to upload bulk data into the FACETS system through the HIPAA gateway by generating keyword files to enroll, modify, or terminate providers and members; analyzed the data using SAS and retrieved data from MS Excel sheets.
  • Initiated detailed discussions and functional walkthroughs with stakeholders.
  • Responsible for redirecting the destination of user reports from the knowledge link to SharePoint.
  • Actively participated in producing system documents, functional requirements, and ETL mapping and modeling deliverables; traced the root causes of data inaccuracies and improved data quality using DataFlux.
  • Performed Collibra Workflow development and configuration based on MS Data Governance Approach & Requirements.
  • Involved in resolving data integrity and quality exercises to reconcile data from Master Data Management (MDM).
  • Created various transformation procedures using SAS, ETL, and SAS Enterprise Guide.
  • Responsible for scheduling workflows, error checking, production support, maintenance and testing of ETL procedures using Informatica session logs.
  • Performed performance tuning on sources, targets, and mappings, as well as SQL (optimization) tuning.
  • Conducted process mapping and identified process pinch points during a Kaizen activity to reduce tool and manpower waste.
  • Tested the ETL data movement from the Oracle data mart to the Netezza data mart on an incremental and full-load basis.
  • Involved in developing Data Mapping, Data Governance, Transformation, and Cleansing rules for the Master Data Management architecture involving OLTP, ODS, and OLAP.
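
A minimal sketch of a query against a star schema of the kind described above. The fact and dimension tables (enrollment_fact, date_dim, site_dim) and their columns are hypothetical and only illustrate how a fact table joins to its dimensions for a trend report.

    -- Hypothetical trend query joining a fact table to its dimension tables
    SELECT d.calendar_month,
           s.site_name,
           SUM(f.enrolled_subjects) AS total_enrolled
    FROM   enrollment_fact f
           JOIN date_dim d ON d.date_key = f.date_key
           JOIN site_dim s ON s.site_key = f.site_key
    GROUP  BY d.calendar_month, s.site_name
    ORDER  BY d.calendar_month, s.site_name;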

Confidential, Atlanta, GA

Data Analyst

Responsibilities:

  • Worked on mapping of data for an early-stage development clinical trial study through an application named Trial Setup.
  • Involved in multiple project teams of technical professionals through all phases of the SDLC, using technologies including Oracle, Erwin, DataStage, Data Warehousing, WebSphere, and Cognos.
  • Worked on the ETL mappings, analysis, and documentation of OLAP reports requirements.
  • Worked on developing data models, data dictionaries, and a CRUD matrix in SAP HANA, along with database schemas such as star and snowflake schemas used in dimensional and multidimensional (OLAP) databases.
  • Involved in implementing Collibra to automate data management processes.
  • Involved in working in the area of data management, including Data Modeling, Metadata, Data Analysis, Data Integrity, Data Mapping, and Data Dictionaries.
  • Worked on using Shared Containers and creating reusable components for local and shared use in the ETL process.
  • Wrote SQL queries in MS Access to sort and analyze large data sets during the project life cycle.
  • Used Informatica as an ETL tool to create source/target definitions, mappings, and sessions to extract, transform, and load data into staging tables from various sources.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy, and Stored Procedure transformations.
  • Assisted ETL team to define Source to Target Mappings.
  • Participated in writing SQL queries to validate source data against data in the data warehouse, including identification of duplicate records; a representative check is sketched after this list.
  • Tested whether the reports developed in Business Objects and Crystal Report are as per company standards.
  • Worked on Informatica PowerCenter tools - Source Analyzer, Warehouse Designer, Mapping Designer & Mapplets, and Transformations.
  • Used SDLC (System Development Life Cycle) methodologies such as Agile and Scrum.
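
A minimal sketch of the duplicate-identification and source-versus-warehouse validation referenced above. The table names (stg_patient, dw_patient_dim) and the patient_id key are hypothetical.

    -- Hypothetical duplicate check: natural keys appearing more than once in the warehouse
    SELECT patient_id,
           COUNT(*) AS dup_cnt
    FROM   dw_patient_dim
    GROUP  BY patient_id
    HAVING COUNT(*) > 1;

    -- Hypothetical row-count comparison between the source staging table and the warehouse
    SELECT COUNT(*) AS source_rows    FROM stg_patient;
    SELECT COUNT(*) AS warehouse_rows FROM dw_patient_dim;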
