
Data Analyst And Data Lab Admin Resume


SUMMARY

  • 14 years of IT experience in the analysis, design, development, and implementation of a variety of Data Warehouse applications, utilizing a full project life cycle approach.
  • Collaborates with customers and delivery teams to analyze and identify necessary data sources, data extraction methods, and integration opportunities, with a focus on real-time data collection.
  • Collects, analyzes, interprets, and processes data together with the development team for use in solution data models, with a strong focus on data quality and customer requirements.
  • Strong working experience with Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, Teradata Parallel Transporter (TPT)) and analytical tools such as BASE SAS.
  • Programming experience using SQL, PL/SQL (stored procedures, functions, and triggers), Oracle, VBScript, UNIX shell programming, and R.
  • Hands-on experience in designing and developing Extract, Transform, and Load (ETL) processes using ETL tools such as Informatica and Ab Initio.
  • Experience in developing dashboards and ad hoc reports using Tableau and Power BI.
  • Hands-on experience with Microsoft Azure and Azure Databricks in the pharma domain.
  • Provided 24x7 production support, troubleshooting production issues, monitoring, performance tuning and maintenance for Oracle D2K (Forms and Reports) applications while working in automotive domain.
  • Proficient in maintaining Teradata data labs, providing the required level of access to business users, monitoring performance through the Teradata Viewpoint tool, and tuning SQL queries.
  • Coded well-tuned SQL and UNIX Shell scripts for high volume data warehouse instances.
  • Developed scripts for build, deployment, maintenance and related tasks using VB scripts, Oracle SQL and bash.
  • Proficiency in SQL, DBMS languages, and data profiling; proficient in interpreting data, analyzing results, and providing reports.
  • Experience in evaluating business requirements and transforming client requests into data deliverables.
  • Experience in maintaining data artifacts such as product specifications and technical documentation.
  • Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
  • Experience working with third-party vendors such as ZS Associates, IQVIA, Symphony Health, and Boston Consulting Group (BCG), which specialize in the pharma commercial insights space.
  • Certified Scrum Master; certified in Teradata and Azure fundamentals.

TECHNICAL SKILLS

Operating Systems: Windows, Sun Solaris, Linux

Databases: Oracle 9i/10g/11g, Teradata, MySQL

ETL tools: Ab Initio, Informatica

Reporting tools: Tableau, Power BI, Cognos

Monitoring Tools: Autosys, Teradata Viewpoint

Scripting Languages: SQL, VBScript, Shell, R, SAS, PL/SQL, D2K (Forms and Reports)

Cloud services: Microsoft Azure, Azure Databricks

PROFESSIONAL EXPERIENCE

Confidential

Data Analyst and Data Lab Admin

Responsibilities:

  • Maintain and monitor the Teradata data lab environment using the Teradata viewpoint tool.
  • Monitor the skew factor and permanent space occupied by each object created by the business in each Teradata data lab.
  • Increase or decrease data lab space according to requirements.
  • Maintain the Teradata Viewpoint tool after its migration to version 15.10, which is used to manage the data labs effectively.
  • Design Teradata objects (tables/views) in the data labs according to ad hoc requests raised by business users.
  • Import data into the Teradata database using Teradata tools and utilities such as FastLoad, MultiLoad, and Teradata Parallel Transporter (TPT).
  • Export analytical data using TPT quick-start scripts, which can export one million rows in about 90 seconds.
  • Extract large, analysis-ready patient-level data sets from different sources using business filters.
  • Communicate with different pharma vendors about third-party agreements and share data securely over a secure FTP connection.
  • Automated loads from all sources using UNIX shell scripting and the crontab scheduler.
  • Extensive experience in Teradata code development.
  • Communicate with different vendors and guide them on data availability in the data warehouse for analyses such as patient journey, cohort, compliance and persistence, peer, and mix model analysis.
  • Convert existing SAS code into Teradata code that is efficient, automated, and maintainable.
  • Experience with performance tuning, troubleshooting, and automation of existing processes.
  • Experience in extracting, transforming and loading source data into the Teradata environment from multiple Sources
  • Generate ad hoc reports using reporting tools or SQL interface tools as requested by business users.
  • Automate all deliverables using SAS/Teradata scripts so that business users can perform their regular advanced analytics smoothly.
  • Coordinate with different teams to identify and troubleshoot the data issues in applications
  • Preparing application understanding documents and knowledge artifacts to ensure the system documentation is available with precise information.
  • Coordinate between the offshore team and the customer to keep services running smoothly.
  • Suggest and implement service improvements.
  • Handle ad hoc client requests and deliver reports within SLA.
  • Supported the commercial Insights team in identifying the right data that should migrate from Teradata to Microsoft Azure environment.
  • Extract and store large APLD data sets into Blob storage containers in Azure.
  • Gather requirements from the client side and transition the work to the offshore team.
  • Participate in business meetings with Business Analysts and translate the requirements into technical tasks.
  • Help the team understand the requirements and execute projects within the SLAs.
  • Ensure that all phases of the SDLC (development, validation, and production support) run smoothly.
  • Implement best practices in warehousing and intelligence applications and databases, including designing and creating objects such as tables, diagnosing and repairing user and performance issues, and testing, monitoring, and tuning application queries.
  • Developing the code for various change requests, resolving the defects raised by the QA team and resolving the production issues.
  • Conduct daily stand up meetings with the team.
  • Attend project health update meetings with the client.
  • Collaborate with third parties and suppliers across a wide range of geographies on new areas of work.
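The FastLoad-based imports mentioned above generally follow a standard control-script pattern. The sketch below is a minimal illustration only; the server, credentials, database, table, column, and file names are all hypothetical assumptions, not details from this engagement, and the script requires a live Teradata system to run.

```
/* Hypothetical FastLoad control script (all names illustrative). */
SESSIONS 4;
LOGON tdprod/etl_user,etl_password;

DROP TABLE lab_db.claims_stg;
DROP TABLE lab_db.claims_err1;
DROP TABLE lab_db.claims_err2;

/* FastLoad requires an empty target table. */
CREATE TABLE lab_db.claims_stg (
    patient_id  VARCHAR(20),
    claim_dt    VARCHAR(10),
    claim_amt   VARCHAR(18)
) PRIMARY INDEX (patient_id);

/* Pipe-delimited input; VARTEXT fields must be defined as VARCHAR. */
SET RECORD VARTEXT "|";
DEFINE patient_id (VARCHAR(20)),
       claim_dt   (VARCHAR(10)),
       claim_amt  (VARCHAR(18))
FILE = /data/in/claims.txt;

BEGIN LOADING lab_db.claims_stg
      ERRORFILES lab_db.claims_err1, lab_db.claims_err2;
INSERT INTO lab_db.claims_stg VALUES (:patient_id, :claim_dt, :claim_amt);
END LOADING;
LOGOFF;
```

FastLoad targets an empty staging table and uses the two error tables to capture constraint and uniqueness violations; typed conversion into final tables is then done in SQL.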
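The crontab-driven source automation described above can be sketched as a small wrapper script. This is an illustrative sketch only: the paths, step names, and log location are hypothetical, and the placeholder steps stand in for the real FastLoad/TPT invocations.

```shell
#!/bin/sh
# Hypothetical nightly load wrapper (paths and step names are illustrative;
# the real jobs would invoke FastLoad/TPT with their control scripts).

LOGFILE="${LOGFILE:-$(mktemp)}"

# Run one step, append its output to the log, and stop the job on failure.
run_step() {
    step_name="$1"
    shift
    if "$@" >> "$LOGFILE" 2>&1; then
        printf '%s OK   %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$step_name" >> "$LOGFILE"
    else
        printf '%s FAIL %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$step_name" >> "$LOGFILE"
        exit 1
    fi
}

# Placeholder steps; a real run might be:  run_step "load" fastload < load.fl
run_step "extract" true
run_step "load"    true
```

A crontab entry such as `30 1 * * * /home/etl/bin/nightly_load.sh` (path hypothetical) would schedule the wrapper daily at 1:30 AM.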

Environment: Teradata Studio Express, Teradata SQL Assistant, Teradata Tools and Utilities (TTU), Teradata Viewpoint, BASE SAS, R, UNIX, Sun Solaris, shell scripting, FileZilla, PuTTY, MySQL Workbench, Microsoft Azure, Azure Databricks

Confidential

Data Analyst

Responsibilities:

  • Responsible to support Default loss and recovery (D&R) application in Commercial Finance.
  • Responsible for working with different Businesses and integrating the individual business data into the EDW.
  • Communicate with business users to understand the technical requirements and design the new constructs required and Map the Source to Target Data elements.
  • Analyze the Informatica mappings and Teradata BTEQ scripts to identify the exact piece of code for enhancement.
  • Help the ETL teams understand the requirements appropriately and guide them toward a better design.
  • Review the requirement documents, design documents, and test results to ensure that the teams have arrived at a proper solution.
  • Work with business to make sure that the UAT is performed considering all the aspects of the project.
  • Help the team in preparing the build process to migrate solutions into production.
  • Incorporate the entire SQL into the build release using volatile tables in Teradata.
  • Support the Teradata DBA team to make sure the data changes applied across all the business space wherever it is requested.
  • Data profiling and Data sampling as requested by the business.
  • Scheduling the SNAP process which moves data from CORE tables to ERISK tables.
  • Worked with SQL Override in the Source Qualifier and Lookup transformation.
  • Extensively used various active transformations like Filter Transformation, Router Transformation, Joiner Transformation and Aggregator Transformation.
  • Extensively worked with various passive transformations like Expression Transformation and Lookup Transformation.
  • Help the teams in resolving postproduction issues if any.
  • Compare the requirements with the existing structure of the data and the existing data-feed files received by data management.
  • Understand the requirements from a database perspective: data modeling, designing new database instances/schemas/target tables, and mapping incoming feed files to database objects.
  • Load various tables into the EDW after identifying how new tables and columns can be incorporated into the EDW.
  • Handle ad hoc requests.
  • Conduct daily standup meetings with the team.
  • Attend project health update meetings with the client.
  • Provide monthly projects status reports.
  • Give presentations on the proposed solutions for new areas of identified work.
  • Maintain client relationship to build confidence with the project team.
  • Showcase customer service improvements and best practices followed.
  • Work on new proposals in the current line of business for the client.
  • Leading and motivating the team members.
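The volatile-table build step mentioned above typically stages a subset of source rows in session-local space before applying it to the warehouse. The Teradata SQL below is a minimal sketch under assumed names: the databases, tables, and columns (`core.default_losses`, `edw.dr_summary`, `acct_id`, `recovery_amt`, etc.) are hypothetical, not from this project.

```sql
-- Hypothetical sketch: stage release SQL through a Teradata volatile table.
-- A volatile table lives in the user's spool/temp space for the session only.
CREATE VOLATILE TABLE stg_recoveries AS (
    SELECT acct_id, recovery_amt
    FROM   core.default_losses
    WHERE  load_dt = CURRENT_DATE
) WITH DATA
PRIMARY INDEX (acct_id)
ON COMMIT PRESERVE ROWS;

-- Apply the staged changes using Teradata's UPDATE ... FROM join syntax.
UPDATE tgt
FROM   edw.dr_summary AS tgt, stg_recoveries AS s
SET    recovery_amt = s.recovery_amt
WHERE  tgt.acct_id = s.acct_id;
```

Because the volatile table is dropped automatically at logoff, the release script leaves no permanent staging objects behind.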

Environment: Teradata SQL Assistant, Informatica, BTEQ scripts, UNIX, Sun Solaris, shell scripting, FileZilla, PuTTY

Confidential

Data Analyst

Responsibilities:

  • Support the customer profitability dashboard and drill-through reports developed in the Cognos reporting tool, which capture the portfolio performance of the reporting managers (RMs).
  • Understand the business requirements and prepare the technical design document (TDD) and functional design document (FDD).
  • Responsible to coordinate with ETL development team, reporting team and data analyst teams during each phase of project life cycle.
  • Act as the point of contact between the technology team and business managers/users.
  • Facilitate team meetings to discuss requirements and propose possible solutions.
  • Evaluate alternative solutions and arrive at the most suitable solution to meet the requirements.
  • Validate the implementation and sign-off to implement across the environments.
  • Developed reusable Excel macros to validate data across different databases and report mismatches.
  • Developed reusable Excel macros to validate database objects across different databases.
  • Developed reusable Excel macros to report data quality for given database tables.
  • Respond to all ad hoc requests raised by the customer.
  • Validate the navigation of the entire RM dashboard and its drill-through reports after migration from Oracle to Teradata.
  • Provide regular updates to clients and explain the current status and plan of the migration project.
  • Coordinate with the development team and reporting team to understand and resolve the technical challenges they face in the migration.
  • Validate the data from the Oracle database against the Teradata database at each level of the production run.
  • Check the connectivity of all dependent systems, such as KPI and Customer Hub, from the profitability system after migrating the database to Teradata.
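The Oracle-to-Teradata data validation described above is commonly done by running the same aggregate on both sides and comparing the results. The sketch below is illustrative only; the schema, table, and column names are assumptions, not details from the project, and each statement must run on its respective database.

```sql
-- Hypothetical migration-validation pattern: compare row counts and sums
-- for the same logical table on source and target (names illustrative).

-- On the Oracle source:
SELECT COUNT(*)          AS row_cnt,
       SUM(portfolio_bal) AS total_bal
FROM   profit.rm_portfolio;

-- On the migrated Teradata target:
SELECT COUNT(*)          AS row_cnt,
       SUM(portfolio_bal) AS total_bal
FROM   profit_db.rm_portfolio;
```

Any mismatch in the counts or totals flags that table for a more detailed column-level comparison before sign-off.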

Environment: Teradata SQL Assistant, Ab Initio, TPT scripts, BTEQ scripts, Cognos
