Data Analyst/ETL Developer Resume
Charlotte, NC
PROFESSIONAL SUMMARY:
- Over 6 years of experience in the Information Technology (IT) industry serving as a Data Quality Analyst/SQL Developer in a data warehousing environment.
- Maintains Corporate Treasury Data Management's Data Quality Rules and Measurement and Controls Standard-related efforts to comply with enterprise requirements.
- Completes data profiling for all new data sets.
- Expert in writing complex CASE statements; in-depth knowledge of Microsoft Excel (pivot tables, VLOOKUP, HLOOKUP, macros), Access, PowerPoint, and Visio.
- Provides initial validation, testing, and post-production validation for all implemented DQ rules, and manages relationships with Data Stewards by product as DQ rules are being implemented.
- Wrote complex SQL queries and Data Quality requirements documents to manage Data Quality rules.
- Experience working with relational databases (Netezza, Oracle) and tools (Toad, Tableau, Excel) to generate data quality reports, and querying Oracle databases to identify and document controls.
- Worked with Metadata team to update data flow mapping Visio and Lineages for Corporate Treasury applications.
- Knowledge of and prior experience in Corporate Treasury functions, general ledger, trading products, and regulatory reporting.
- Created and maintained processes and controls for Corporate Treasury and their business partners.
- Experience in writing stored procedures using all kinds of joins, control-of-flow logic, cursors, correlated subqueries, and XML techniques (see the illustrative sketch at the end of this summary).
- Strong skills in resolving Teradata issues and providing workarounds, with knowledge of diagnostics, database tuning, SQL tuning, and performance monitoring.
- Extensive experience developing ETL programs supporting data extraction, transformation, and loading using Informatica PowerCenter.
- Perform analysis, process flow and use case development for integration of reporting data between two existing reporting platforms.
- Experience in SAS/BASE, SAS/MACRO, SAS/ODS, SAS/SQL, SAS/STAT, SAS/GRAPH, SAS/ACCESS, and SAS/CONNECT in UNIX and Windows environments.
- Experienced in building various types of SSRS reports using drill-down, drill-through, parameters, subreports, tables, matrices, charts, and textboxes.
- Solid knowledge of data warehouse concepts such as OLAP, OLTP, star schema, and snowflake schema.
- Hands on experience in developing XML Schemas, elements and attributes.
- Skills include data analysis, collection, and reporting using big data mining techniques and frameworks.
- Supporting SAP systems running on Linux, UNIX, and Windows-based servers across 5 data centers.
- Attending required meetings, planning and executing work, providing feedback on requirements, reporting metrics, tracking and reporting progress, and creating/submitting required documentation or evidence.
- Performed data extraction, transformation and loading using SSIS packages.
- Created Use Case Diagrams, Activity Diagrams, Sequence Diagrams, Dataflow Diagrams in MS Visio.
- Familiar with SAS DataFlux Data Management Studio; worked closely with SAS developers to automate DQ rules. Performed UAT and post-production validation testing on software deployments.
- Supported with creating and compiling documentation and evidence of compliance.
- Documented and escalated issues as needed. Performed unit testing at various levels of the ETL and actively involved in team code reviews.
- Experience with Teradata Enterprise Data Warehouse (EDW) and data marts; experience designing and implementing data structures and using common business intelligence tools for data analysis.
- Detail- and deadline-oriented data analyst with strong organizational and analytical skills.
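An illustrative T-SQL sketch of the kind of stored procedure described in this summary, combining a CASE expression with a correlated subquery; the schema (dbo.Customers, dbo.Orders) and tier cutoffs are hypothetical, not any client's actual objects:

```sql
-- Hypothetical schema for illustration only:
-- Customers(CustomerID, Region), Orders(OrderID, CustomerID, Amount, OrderDate).
CREATE PROCEDURE dbo.usp_ClassifyCustomerSpend
    @AsOfDate DATE
AS
BEGIN
    SET NOCOUNT ON;

    SELECT
        c.CustomerID,
        c.Region,
        s.TotalSpend,
        -- CASE expression bucketing each customer into a spend tier
        -- (tier cutoffs are invented for the example).
        CASE
            WHEN s.TotalSpend >= 100000 THEN 'High'
            WHEN s.TotalSpend >= 10000  THEN 'Medium'
            ELSE 'Low'
        END AS SpendTier
    FROM dbo.Customers AS c
    -- Correlated subquery via OUTER APPLY: total spend per customer
    -- up to the as-of date.
    OUTER APPLY (
        SELECT SUM(o.Amount) AS TotalSpend
        FROM dbo.Orders AS o
        WHERE o.CustomerID = c.CustomerID
          AND o.OrderDate <= @AsOfDate
    ) AS s;
END;
```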
TECHNICAL SKILLS:
Web Technologies: CSS, XML, XML Schema, HTML/HTML5, JavaScript, Web Services, WCF.
Operating Systems: MS Windows X/8/10, UNIX, Linux.
Databases: MS SQL Server, Oracle, MS Access, Netezza, Azure; Toad (client tool)
SQL Server Tools: SSMS, BIDS, SQL Server Profiler, Performance Monitor
Programming Languages: C, C++, T-SQL, SQL, Python, VBA, PL/SQL, Teradata SQL
Processes and Architecture: Agile (Scrum, XP), SDLC, OOAD, UML
Microsoft Technologies: SQL, SSIS, SSRS, SSAS, Visual Studio 2010, Power BI
ETL Tools: Informatica PowerCenter, Ab Initio Express>IT
PROFESSIONAL EXPERIENCE:
Confidential - Charlotte, NC
Data Analyst/ETL Developer
- Working with business and technical application owners to write rule requirements for data at rest and to obtain control catalogs/metrics around front-end and movement controls.
- Working with the eDQM Profile and Rule team on implementing rule requirements in the Ab Initio tool.
- Developing end-to-end testing and working with business application owners on remediating the data.
- Developing data quality rules, including manual and system control processes.
- Performing data defect analysis and remediation
- Working with multiple database platforms such as Oracle, SQL Server, Teradata, DB2, Hadoop and Hive
- Producing data artifacts and evidence to demonstrate adherence, working on data models and the data supply chain.
- Creating and maintaining data quality dashboards and metrics, and developing complex SQL queries for multiple applications and configurations (an illustrative sketch follows this section).
Environment: Ab Initio Express>IT, Metadata Hub, Visio, MS Word, PowerPoint, Excel, SQL.
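A minimal sketch of the kind of data quality rule query described above; the completeness rule, the dbo.CustomerAccount table, and its columns are assumptions for illustration, not the client's actual schema:

```sql
-- Hypothetical completeness rule: every active account should carry a
-- non-null tax ID. Table and column names are invented for the example.
SELECT
    COUNT(*) AS TotalRecords,
    SUM(CASE WHEN TaxID IS NULL THEN 1 ELSE 0 END) AS FailedRecords,
    100.0 * SUM(CASE WHEN TaxID IS NOT NULL THEN 1 ELSE 0 END)
          / NULLIF(COUNT(*), 0) AS PassedPct
FROM dbo.CustomerAccount
WHERE AccountStatus = 'ACTIVE';
```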
Confidential - Cincinnati, OH
Data Migration Specialist
- Performed testing in data migration and executed and verified ETL jobs (DataStage).
- Performed data loading into the database after extracting source data from YAML (.yml) files.
- Planned and managed end-to-end data migrations and conversions from MySQL, CSV files, MS SQL Server, MS Access, etc.
- Loaded data from load files to SRC files in the target database.
- Developed and modified SQL queries for data validation and data verification on the database (see the sketch after this section).
- Worked on data profiling and checked for discrepancies in record counts across different applications.
- Created, updated, and truncated tables with each iteration ID for every release.
- Performed testing for multiple company IDs in Eclipse.
- Performed data transformation and data cleansing procedures to prepare legacy system data for successful import into the new system.
- Performed dataset testing on code scripts and screen-to-screen validation.
- Validated data from multiple sources such as files, XML, and databases.
- Set up an auto-execution controller for testing with different iteration IDs.
- Imported load files and SRC files from one source to another.
- Worked on Wire and ACH templates with different company IDs.
- Executed scripts for customer entitlements and account entitlements.
- Performed ETL processes and relational database design methods.
- Analyzed and reported data after successful executions.
Environment: Oracle, SQL Server, IBM DB2, Teradata, MySQL, Eclipse, METL
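A minimal sketch of the kind of source-to-target count validation used during the migration testing described above; the database and table names (legacy, target, dbo.Customer) are hypothetical:

```sql
-- Hypothetical row-count reconciliation between a legacy source table and
-- its migrated target; any non-zero CountDiff is a discrepancy to research.
SELECT
    s.SourceCount,
    t.TargetCount,
    s.SourceCount - t.TargetCount AS CountDiff
FROM (SELECT COUNT(*) AS SourceCount FROM legacy.dbo.Customer) AS s
CROSS JOIN
     (SELECT COUNT(*) AS TargetCount FROM target.dbo.Customer) AS t;
```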
Confidential - Charlotte, NC
Data Quality Analyst
- Testing (Quality Assurance) lead for Asset Traceability Product and a member of Treasury Data Management Team.
- Monitored the quality of data for governance, downstream regulatory reporting, and operational effectiveness.
- DQR involves validating data in the various applications for completeness and correctness per the Data Quality Rules provided by the business.
- Tested data through SQL queries executed on multiple platforms, including Oracle, SQL Server, and Netezza (via Toad).
- Used a data profiling tool to profile data and complete the rule development process; rules are generally written against a product's table or view.
- Understood the business of the company and supported management with ad hoc reporting efforts as needed.
- Wrote appropriate SQL Queries to interact with the Oracle Database.
- Performed additional checks for dependencies and notified reviewers after completing the M&C file. Created business rules and built SQL logic to generate DataFlux rules and jobs.
- Contacted the DQ Tech team for UAT and validated all the data; updated RDM where applicable.
- After releasing all rules into production, performed post-production validation (PPV) to test the DQ Tech implementation.
- Created tickets in the portal for the respective products and rules; maintained and updated all tickets and the DQR status reporting file.
- Monitored database performance and checked execution times; maintained various deadline-specific daily reporting efforts.
- Analyzed and worked with various macros and Java files used in the scripts.
- Wrote complex SQL queries to validate data against the mapping requirements and data quality rules (see the sketch after this section); validated the data quality reports in customized dashboards.
- The role also requires impeccable documentation skills and the ability to work independently and stay self-motivated.
- Responsible for completing daily activities of regulatory reporting for the Banking Department within the Main organization.
- Monitored the passed and failed percentages of DQ rules; when a threshold is breached, the appropriate product Data Steward is notified as part of the daily reporting process.
- Submitted ARM requests to obtain access to databases and connections.
- Assisted with the preparation of timely and accurate financial reports to regulatory agencies, associations, and internal departments.
- Researched data and DQ rule issues as needed; analyzed and interpreted complex data on all target systems; resolved data mart issues by performing preventive actions.
- Used the web-based Info Quality (IQ) tool, which includes a Data Quality module powered by CIDW.
- Utilized Spark SQL for event enrichment and to prepare various levels of user behavior summaries.
- Validated the data mart using SQL queries and communicated data requirements to the ETL developers; went through the requirements and raised clarifications.
- Analyzed data, pulled reports, and manipulated data to fulfill end users' requirements; used MicroStrategy to export reports to Excel with pivot tables, etc.
- Worked with different teams like DQ Tech, Data Stewards, Info Quality, DQC and LIBRA Team.
Environment: Oracle, SQL Server, Netezza, JavaScript, Toad, Exadata, MS PowerPoint, MS Outlook, MS Excel, MySQL, DataFlux, Tableau, VBA, Python, Spark, DataFlux Web Studio, IQ.
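A minimal sketch of the kind of threshold monitoring described above; the dbo.DQRuleResult table, its columns, and the status wording are assumptions, not the actual DQ platform's schema:

```sql
-- Hypothetical daily DQ monitoring: compute each rule's pass rate and flag
-- rules whose rate falls below threshold so the Data Steward can be notified.
SELECT
    RuleID,
    RunDate,
    100.0 * PassedCount / NULLIF(PassedCount + FailedCount, 0) AS PassedPct,
    ThresholdPct,
    CASE
        WHEN 100.0 * PassedCount / NULLIF(PassedCount + FailedCount, 0)
             < ThresholdPct
        THEN 'BREACHED - notify Data Steward'
        ELSE 'OK'
    END AS ThresholdStatus
FROM dbo.DQRuleResult
WHERE RunDate = CAST(GETDATE() AS DATE);
```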
Confidential - Dallas, TX
SQL Developer
- Worked with user groups, management, and other colleagues to investigate and assess requests and compose detailed descriptions of user needs. Understood the business of the company.
- Used Rational Rose for the use case Diagrams, class diagrams, and sequence diagrams to represent the detailed design phase.
- Heavily involved in a data modeler role, reviewing business requirements and composing source-to-target data mapping documents.
- Designed and developed logical and physical data models and metadata to support the requirements.
- Responsible for cleansing and normalizing data for an employee efficiency and departmental profitability analysis project, using MS Access and MS Excel (pivot tables, VLOOKUP) and creating custom user-defined functions with VBA and SQL.
- Designed and developed weekly, monthly reports related to the marketing and financial departments using Teradata SQL.
- Introduced and implemented Agile Methodologies such as Scrum, Extreme Programming (XP) and Test-Driven Development (TDD).
- Created SQL scripts to create/drop database objects such as tables, views, indexes, constraints, sequences, and synonyms (see the sketch after this section); involved in SQL performance tuning.
- Assisted in the oversight for compliance to the Enterprise Data Standards, data governance and data quality. Involved in unit testing and integration testing of the Application.
- Used Bootstrap to transform standard HTML websites into single-page designs. Used web services (SOAP, WSDL) to communicate with other applications and components.
- Wrote appropriate SQL Queries to interact with the Oracle Database.
- Monitored the performance of the database and checked on the execution time.
Environment: Java EE, JavaScript, Teradata, CSS, HTML, SQL Server 2008/2012, Developer Express Tools, XML.
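An illustrative Oracle-style DDL sketch of the create scripts described above; every object name is invented for the example:

```sql
-- Hypothetical reporting objects: table, constraint, index, sequence,
-- view, and synonym. Names are assumptions for illustration.
CREATE TABLE dept_report (
    report_id   NUMBER       PRIMARY KEY,
    dept_code   VARCHAR2(10) NOT NULL,
    report_week DATE         NOT NULL,
    CONSTRAINT uq_dept_week UNIQUE (dept_code, report_week)
);

CREATE INDEX ix_dept_report_week ON dept_report (report_week);

CREATE SEQUENCE seq_report_id START WITH 1 INCREMENT BY 1;

CREATE OR REPLACE VIEW v_weekly_report AS
    SELECT dept_code, report_week, COUNT(*) AS report_count
    FROM dept_report
    GROUP BY dept_code, report_week;

CREATE SYNONYM weekly_report FOR v_weekly_report;
```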
Confidential
SQL/ETL Developer
- Understood the data mart and reporting requirements.
- Compiled the reporting requirements and provided systems analysis to determine the correct data elements.
- Developed SSRS reports based on SQL Server Stored procedures.
- Identifying the need of performing ETL to bring the data into Data Mart.
- Worked on dimensional modeling with star and snowflake schemas for the data mart.
- Developed the OLTP system by designing the logical and, eventually, the physical data model from the conceptual data model.
- Validating the Data Mart using SQL queries and communicating to the ETL developers about the data requirement.
- Worked on designing a star schema for the detailed and plan data marts involving conformed dimensions.
- Developed dashboards in SSRS using charts, data bars, conditional formatting, and advanced reporting calculations.
- Developed summary reports, trend reports, and aging-bucket reports to provide monthly, quarterly, and yearly aggregates (see the sketch after this section).
- Developed transactional level detail reports to provide most granular level details.
- Reengineered data model, SQL queries and reports.
- Created tables, data relationships, and advanced queries/reports, and modified existing tables and data relationships.
- Developed Excel Reports to provide report prototypes to the users.
- Maintained versions of Reporting Project Code, (Business Intelligence Development Studio Projects).
- Maintained the user-friendly organization and security of reporting portal.
- Developed and maintained SQL Server Stored procedures to support reporting.
- Designed the reports to convert the reporting requirement into most effective report form.
- Presented reporting sample to the users to get sign off on detail reporting requirement.
- Provided different options to the users to choose from and to select one to perform report development.
- Prepared technical documentation for data definition of each metric and process.
- Involved in migrating the SQL Server platform from SQL Server 2005 to SQL Server 2008.
Environment: MS SQL Server 2008, SQL Server Reporting Services (SSRS), SAP, SQL Server Integration Services (SSIS), Business Intelligence Development Studio (BIDS), OLAP, XML, DTS, T-SQL.
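A minimal T-SQL sketch of a stored procedure that could back the aging-bucket reports described above; the dbo.Invoice table, its columns, and the bucket boundaries are assumptions for illustration:

```sql
-- Hypothetical aging-bucket aggregation for an SSRS report: open invoices
-- grouped into 0-30/31-60/61-90/90+ day buckets as of a report date.
CREATE PROCEDURE dbo.usp_InvoiceAgingBuckets
    @AsOfDate DATE
AS
BEGIN
    SET NOCOUNT ON;

    SELECT
        CASE
            WHEN DATEDIFF(DAY, InvoiceDate, @AsOfDate) <= 30 THEN '0-30'
            WHEN DATEDIFF(DAY, InvoiceDate, @AsOfDate) <= 60 THEN '31-60'
            WHEN DATEDIFF(DAY, InvoiceDate, @AsOfDate) <= 90 THEN '61-90'
            ELSE '90+'
        END             AS AgingBucket,
        COUNT(*)        AS InvoiceCount,
        SUM(OpenAmount) AS TotalOpenAmount
    FROM dbo.Invoice
    WHERE Status = 'OPEN'
    GROUP BY
        CASE
            WHEN DATEDIFF(DAY, InvoiceDate, @AsOfDate) <= 30 THEN '0-30'
            WHEN DATEDIFF(DAY, InvoiceDate, @AsOfDate) <= 60 THEN '31-60'
            WHEN DATEDIFF(DAY, InvoiceDate, @AsOfDate) <= 90 THEN '61-90'
            ELSE '90+'
        END;
END;
```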
Confidential
Data Analyst
- Went through the requirements and raised clarifications.
- Created database, tables and procedures to update and clean the old data.
- Strong RDBMS skills and hands-on experience with SQL Server, performing SQL and PL/SQL programming.
- Wrote SQL scripts to insert, update, and delete data in the MS SQL database; performed normalization and de-normalization of tables.
- Developed backup and restore scripts for SQL Server 2008 (see the sketch after this section); installed and configured SQL Server 2005 with the latest service packs.
- Created SSIS packages to import and export data from Excel, XML, and text files.
- Transformed data from various data sources using OLE DB connection by creating various SSIS packages.
- Analyzed data, pulled reports, and manipulated data to fulfill end users' requirements; used MicroStrategy to export reports to Excel with pivot tables, etc.
- Created schemas, views, stored procedures, triggers, and functions for data migration.
- Formulating SQL queries and stored procedures on assigned modules.
- Wrote stored procedures and T-SQL code to accomplish various tasks.
- Worked on MS SQL Server 2008-R2 and on Visual Studio 2008(SSIS/SSAS/SSRS).
- Troubleshot SSIS package issues such as data types, metadata, dynamic file or table names, and flat-file or Excel format settings.
Environment: SQL Server 2012/2008 Enterprise, T-SQL, Microsoft Reporting Services, MS PowerPoint, MS Outlook, MS Excel, MySQL, Business Intelligence Development Studio.
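A minimal T-SQL sketch of the kind of backup and restore script described above; the database name, logical file names, and disk paths are assumptions for illustration:

```sql
-- Hypothetical full backup with checksum verification.
BACKUP DATABASE SalesDW
TO DISK = N'D:\Backups\SalesDW_full.bak'
WITH INIT, CHECKSUM, STATS = 10;

-- Hypothetical restore to a separate verification copy; the logical file
-- names ('SalesDW', 'SalesDW_log') and target paths are invented.
RESTORE DATABASE SalesDW_Verify
FROM DISK = N'D:\Backups\SalesDW_full.bak'
WITH MOVE 'SalesDW'     TO N'D:\Data\SalesDW_Verify.mdf',
     MOVE 'SalesDW_log' TO N'D:\Data\SalesDW_Verify.ldf',
     RECOVERY, STATS = 10;
```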