
Senior Data Quality Analyst / MDM Data Analyst Resume


SUMMARY

  • Experienced Data Analyst with expertise in data analysis, Business Intelligence solutions, data quality analysis, data profiling, MDM, database development, and data integration. Experience working with a variety of complex data from domains such as finance, banking, Anti-Money Laundering, logistics, risk management, and digital marketing.
  • Hands-on Data Warehouse/Analytics product development experience covering all phases of the SDLC, using both Agile and Waterfall methodologies. Experienced as a Talend/Big Data developer with good knowledge of Hadoop ecosystem technologies, and experienced with the Talend Big Data Integration Suite for the design and development of ETL/Big Data code and mappings.
  • Experience in ETL development, Business Intelligence, dashboard reporting solutions, star and snowflake schemas, LDM design, and application integration. Highly experienced in various aspects of Enterprise Data Management (EDM), data cleansing, and data profiling. Experienced in migration and conversion of data to/from various data sources.
  • Strong experience in the Anti-Money Laundering domain, well versed in AML regulations and regulatory procedures. Highly experienced in all aspects of Know Your Customer (KYC), customer due diligence, and behavior detection. Well versed in the sanction screening and transaction monitoring processes and in list management of suspects.
  • Experience in database design, logical and physical modelling, and writing stored procedures, triggers, views, and functions in Oracle, Sybase ASE, MS SQL Server, IBM DB2, and MySQL using SQL, T-SQL, and PL/SQL. Highly skilled in database performance tuning and optimization techniques.
  • Extensive experience in applications development, operations, and technical support, with full Software Development Life Cycle (SDLC) experience spanning requirements, specification analysis, design, and testing.
  • Experience in the finance, banking, market research, Anti-Money Laundering, risk management and assessment, payment processing gateway, logistics, and digital marketing domains.

TECHNICAL SKILLS

ETL Tools: Talend Suite 5.x/6.x, SQL Server Integration Services (SSIS), Informatica PowerCenter 7.x/8.x/9.x/10.x

BI Tools: Tableau, Power BI, SSAS, SSRS, Qlikview, Business Objects, Crystal Reports, Omniscope

Data Quality Tools: Dataflux, Informatica IDQ, Informatica Developer 10.x, Address Doctor, Trillium, Talend EDQ, Oracle Enterprise Data Quality (OEDQ)

Databases: Oracle 10g/11g, MS SQL Server 2005/2008/2012, MongoDB, Sybase ASE 11.5/12.x/15.x, IBM DB2, Mainframe, IMS, MySQL, MS Access

Tools: Mantas, Actimize, Embarcadero DB Artisan, Rapid SQL, Toad, TAC, Autosys, Control-M, Gupta SQL, PowerBuilder, Oracle Forms & Reports, Sybase Open Client, SQL*Loader, SQL*Plus, Erwin, ER Studio, Power Designer, Visual SourceSafe, ClearCase, WinRunner, MS Office, MS Project, Visio, Remedy

Languages: SQL, PL/SQL, T-SQL, NoSQL, SAS, Unix shell scripts, Perl, VBA, VBScript, VB 5.x/6.x, Excel macros, C, XML, HTML, XUL, ASP.NET, JavaScript, J2EE

Operating Systems: Sun Solaris, Linux, Windows NT, HP-UX, UNIX, MS DOS, Mac OS X

PROFESSIONAL EXPERIENCE

Confidential

Senior Data Quality Analyst / MDM Data Analyst

Responsibilities:

  • The project involved the integration of data from multiple source systems (DB2, Oracle) in the organization into the Master Data Management platform, through an Informatica ETL layer for the source systems and a MuleSoft layer for the downstream consuming systems.
  • Worked extensively on the data analysis of over 250 data attributes in the existing MDM system for the Location, Customer, Vehicle, and Fleet data domains.
  • Conducted data profiling on the existing data using Informatica IDQ to determine the current status of data in the Location, Customer, Vehicle, and Fleet domains.
  • Based on the profiling results, drilled down deeper into the data attributes to determine the correctness, completeness, and accuracy of the data (a representative profiling query appears after this list).
  • Participated in multiple meetings with the business users and the business analysts to conduct in-depth discussions related to mapping and requirement documents.
  • Reviewed the existing requirements and source-to-target mapping documents to verify whether the mappings and logic were correctly defined, and recommended corrective actions based on the observations.
  • Analyzed the data in the DB2 and Oracle source systems to check the status and accuracy of data with respect to the target EBX MDM system.
  • Worked on a consistent basis with the Data Owners, business analysts, Informatica developers, and other stakeholders on the appropriate data attributes in EBX and their transformation from the source system to the target through the Informatica ETL process.
  • Worked with the InFact team to make sure that all the data elements to be consumed by downstream systems such as EAM, Rates ResRent, etc., are present in the EBX JSON layer.
  • Discussed the profiling results with the Data Owners, stakeholders and all the relevant parties to apprise them of the pain points and anomalies in the data.
  • Prioritized the data attributes to cleanse and fix, based on their impact on the downstream consuming systems and on the business overall.
  • Created scorecards in IDQ, with appropriate business rules to gauge the status and progress of data through multiple iterations of MDM data population.
  • Conducted multiple profiling runs to compare the stats for the data in multiple iterations to determine the progress and improvement of the data overall.
  • Created Tableau reports and dashboards by importing data from the profiling runs and the MDM tables, with visualizations and charts that highlight the data improvement and display statistics and metrics for various data attributes.
  • Worked on cleansing the address fields (address, city, state, zip) in the MDM system using Informatica Address Doctor to identify, cleanse, standardize, and enhance the overall data quality in the address fields.
  • Worked on the data analysis and remediation of over 35 key indicator fields in the MDM. Analyzed their population logic, fixed the logic where needed, and deployed the fixes to populate the corrected data.
  • Worked on cleansing, standardization, and correction of various other fields such as phone numbers, IDs, names, OAGs, latitudes, longitudes, etc.
  • Worked with the data owners and other key stakeholders to come up with a set of business rules, implemented in the Informatica IDQ layer, to monitor the quality of data and filter records on the basis of their categories for further action.
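
For illustration, a minimal profiling query of the kind used for these completeness and validity checks might look like the following (a T-SQL sketch; dbo.location_master and its columns are hypothetical stand-ins, since the actual EBX MDM tables are not named here):

    -- Completeness/validity profile for address attributes (hypothetical table).
    SELECT
        COUNT(*)                                              AS total_rows,
        SUM(CASE WHEN postal_code IS NULL THEN 1 ELSE 0 END)  AS missing_zip,
        SUM(CASE WHEN postal_code IS NOT NULL
                  AND postal_code NOT LIKE '[0-9][0-9][0-9][0-9][0-9]'
                 THEN 1 ELSE 0 END)                           AS malformed_zip,
        COUNT(DISTINCT state_code)                            AS distinct_states
    FROM dbo.location_master;

Comparing these counts across successive MDM load iterations gives the same progress view that the IDQ scorecards provide.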

Environment: MS SQL Server 2017, Oracle, Toad, DB2, EBX MDM, Informatica IDQ Analyst, Informatica Developer 10.x, Tableau 10.x, Informatica PowerCenter 10.x, Address Doctor, MuleSoft, Azure, Excel, VBA

Confidential

Senior BI Analyst/ Data Analyst

Environment: MS SQL Server 2017, Power BI, Power BI Service, T-SQL, SSRS, SSIS, Google Analytics, SurveyMonkey, Dataflux, DAX, O365 Adoption Pack, MongoDB, Azure, NoSQL, Excel, VBA

Responsibilities:

  • The project involved creating highly informative reports and dashboards based on United Nations data related to its various development programs, publications, and Yammer usage.
  • Interacted with the users and conducted meetings to understand the report requirements, their business views, and the level of detail they needed, and worked on data mappings.
  • Involved in discussions with various users and program managers to determine exactly what figures and statistics they wanted to see in the reports and dashboards.
  • Created several report mock-ups for the users to demo and finalize the requirements, and conducted discussions with the admins about procuring the data from different data sources.
  • Created multiple SSIS packages to import data from various systems and perform transformations on it to make it suitable for consumption downstream.
  • Created multiple SQL scripts, stored procedures, functions, etc. to extract the data from the data sources, process it, and transform it into the right state.
  • Analyzed data in multiple data sources and performed a thorough data quality analysis and profiling to identify anomalies and inadequacies in the data.
  • Performed standardization, enrichment, scrubbing, and cleansing of the data to homogenize it and bring it into a consumable format.
  • Imported data from various platforms such as Google Analytics, SurveyMonkey, Yammer, and Polldaddy, as well as data from secured folders, and formatted it using APIs and the Power BI import and query utilities.
  • Used Power BI Power Query to extract data from external sources and shape the data into the format required for consumption in the dashboards.
  • Mapped user responses from an old survey to a new survey by categorizing and matching the data, based on the tags provided by the relevant data owners (see the join sketch after this list).
  • Created multiple highly interactive reports and dashboards in Power BI, which enabled various users, managers, and program heads to see the performance of the development programs and how users responded to various publications.
  • Created various measures to perform complex calculations that report key metrics and KPIs on the Yammer, knowledge board, and program reports and dashboards.
  • Created reports and dashboards with complex visualizations such as clustered and stacked column charts, pie charts, donut charts, gauges, maps, funnels, dial gauges, word clouds, etc.
  • Displayed various metrics such as topics discussed, most downloaded publications, most read and liked pages, organization types downloading the publications, reasons for downloads, total events, sessions, views, etc.
  • Created reports with data and metrics about various aspects of the development programs as well as the publications, which could be sliced and diced from various angles and drilled down to multiple levels.
  • Created reports with filters such as last N days, year, and month to give users the flexibility of choosing the data segment they require.
  • Created visualizations based on the key words mentioned in the feedback and the topics being discussed, to determine trending words and topics, and performed sentiment analysis to gauge the mood of the users.
  • Created various dashboards by pinning key visualizations from various reports, as well as by pinning live pages to display the whole report as well as the slicers.
  • Scheduled data refreshes for the Power BI reports and dashboards on a periodic basis so the dashboards display the most current data.
  • Deployed and published the Power BI reports and dashboards to share them with the appropriate users. Created multiple roles with appropriate privileges for different users.
  • Involved in performance tuning and optimization of the SQL queries and the reports. Troubleshot issues and coordinated with various teams for bug fixes and deployment of the reports.
  • Created tasks to track the bugs, fixes, and their resolution, which helped improve the overall development time.
  • Involved in design, development, testing, and support of the BI reports. Involved in debugging, monitoring, and troubleshooting performance issues during releases and the weekly runs.
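
As an illustration of the tag-based survey mapping described above, the matching can be expressed as a join through shared tags (a T-SQL sketch; all table and column names are hypothetical, not the actual UN schema):

    -- Map answers from the old survey to questions in the new survey via shared tags.
    SELECT o.response_id,
           o.answer_text,
           nq.question_id AS new_question_id
    FROM dbo.old_response        o
    JOIN dbo.old_question_tag    ot ON ot.question_id = o.question_id
    JOIN dbo.new_question_tag    nt ON nt.tag = ot.tag           -- tags supplied by the data owners
    JOIN dbo.new_survey_question nq ON nq.question_id = nt.question_id;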

Confidential

Senior BI Analyst/ Data Analyst

Environment: Power BI, Power BI Service, MS SQL Server 2017, MongoDB, SSRS, SSIS, T-SQL, ER Studio, Robomongo, MongoDB BI Connector, DAX, NoSQL, SAS, Excel, VBA, JIRA, Autosys, Unix, Shell Script, J2EE

Responsibilities:

  • The project involved upgrading the city's current SMART system from a legacy system to a web-based application, and creating reports and dashboards.
  • Involved in different aspects and development stages of the overall solution and database architecture, and in setting up the development and testing guidelines.
  • Participated in daily stand-ups, sprint planning, sprint reviews, and sprint retrospectives on a regular basis, and reviewed tasks and user stories and provided estimates in JIRA.
  • Interacted with the users and conducted meetings to understand the report requirements, their business views, and the level of detail they needed, and also worked on data mappings.
  • Performed a few proofs of concepts for the reports to determine the right platform. Created several report mock-ups for the users to demo and finalize the requirements.
  • Created multiple SSIS packages to import data from the legacy (mainframe) system, Oracle, and MongoDB into the target SQL Server database for report consumption and other uses.
  • Analyzed data in the existing databases and performed a thorough data quality analysis and profiling to identify anomalies and to scrub and clean the data.
  • Created SQL scripts, stored procedures, common table expressions (CTEs), functions, and ETL flows to extract, clean, scrub, and load the historical data from the existing system into the target tables. Created data lakes and data marts.
  • Created multiple, very complex reports in SSRS and Power BI that run on high volumes of data with response times under a few seconds, pulling data from SQL Server as well as MongoDB.
  • Created multiple highly interactive Power BI and SSRS reports for supervisors and managers, enabling them to see the current status of the work and the next day's shift allocation at the location, district, and borough level.
  • Developed key reports in SSRS that allowed end users to understand the data on the fly, with quick filters for on-demand information.
  • Generated periodic reports based on statistical analysis of the data from various time frames and divisions using SQL Server Reporting Services (SSRS).
  • Developed various operational drill-through and drill-down reports using SSRS, including sub-reports, charts, matrix reports, and linked reports.
  • Created SSRS reports for consumption by the various user groups, which they use to assess the trends and overall status of tasks, shift capacities, and manpower allocations.
  • Scheduled data refreshes on SSRS reports in weekly and monthly increments, based on business changes, to ensure the views displayed the changed data accurately.
  • Extracted and imported data from MongoDB, and configured the MongoDB BI Connector and the ODBC driver to enable smooth communication between MongoDB, SSRS, and Power BI.
  • Created multiple Power BI dashboards and data visualization reports for DSNY supervisors and users to give them zone-wise availability of the resources.
  • Generated dashboards with key indicators for the daily planning team to enable them to plan the resources in terms of tasks, routes, personnel, and their allocations.
  • Used Power BI Power Query to extract data from external sources and shape the data into the format required for consumption in the dashboards.
  • Scheduled data refreshes for the Power BI dashboards on a periodic basis so the dashboards display the most current data.
  • Deployed SSRS reports and published Power BI dashboards to share them with the appropriate users. Created multiple roles with appropriate privileges for different users.
  • Involved in performance tuning and optimization of the complex SQL queries and stored procedures. Created appropriate indexes and scheduled jobs to update statistics on the tables to keep the queries efficient (see the sketch after this list).
  • Involved in troubleshooting issues, and coordinated with the QA team and other teams for bug fixes and deployment of the reports. Created tasks to track the bugs, fixes, and their resolution, which helped improve the overall development time.
  • Involved in design, development, testing, and support of the BI reports. Involved in debugging, monitoring, and troubleshooting performance issues during releases and the weekly runs.
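
A representative example of the indexing and statistics work mentioned above (T-SQL sketch; the table, columns, and index name are hypothetical, not the actual SMART schema):

    -- Index supporting the supervisor status reports filtered by district and date.
    CREATE NONCLUSTERED INDEX ix_work_task_district_date
        ON dbo.work_task (district_id, shift_date)
        INCLUDE (status_code);

    -- Statistics refresh, run from a scheduled job so the optimizer keeps good estimates.
    UPDATE STATISTICS dbo.work_task WITH FULLSCAN;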

Confidential

Senior BI Developer/ Data Analyst

Environment: SAS, Talend Suite 5.x/6.x, Oracle, DB2, SQL, Toad, Tableau 9.x, SAS DataFlux, T-SQL, MS SQL Server 2012, ER Studio, Excel, VBA, SSIS, SSRS, IBM Data Studio, Google Analytics, SharePoint, Autosys, Unix, Shell Script, J2EE

Responsibilities:

  • Involved in the implementation of the Campaign Connect project for Confidential Cross Confidential Shield of Florida, which would replace the existing campaign system and accommodate multiple tenants.
  • Involved in the initiative to convert and enhance the current marketing campaign process in SAS to a setup that includes Talend, Tableau, SPSS, SQL Server, DB2, etc., in an agile environment with 2-3 week sprints.
  • Analyzed the existing SAS flows for the marketing campaigns for BCBS Florida and GuideWell Connect to determine the exact functionality and data elements, and created source-to-target mappings to be used in the new ETL flows to be developed in Talend.
  • Analyzed data in the multiple sources in Campaign Connect and performed data profiling and quality analysis using Talend EDQ and Dataflux to determine the "as-is" status of the data and to identify issues.
  • Conducted content and relationship discovery to determine the quality of the content and how the data is related. Compiled the profiling and discovery results into reports and presented them to the business for discussion.
  • Performed standardization, enrichment, clustering, and cleansing on the data to standardize the format and make it consumable and efficient for the matching process downstream. Implemented complex data quality rules to catch anomalies in the data, dump them into error tables, and trigger configured alerts.
  • Interacted regularly with the key business stakeholders and the business team to conduct discussions about the data quality results, exact business requirements and development efforts for each sprint.
  • Participated in daily stand-ups, sprint planning, sprint reviews, and sprint retrospectives on a regular basis, and reviewed tasks and user stories.
  • Created Talend flows, SQL scripts, procedures, etc. to extract, clean, scrub, validate, and load the historical data into the target tables.
  • Created scripts and Talend flows to perform data quality checks and to transform and load healthcare and enrollment data coming in from various sources such as flat files (delimited, positional), SAS data sets, Salesforce, Oracle, SQL Server, Excel, etc.
  • Set up the flows and FTP framework to automate sending the key demographic files to the vendors and receiving them back, with pre- and post-processing and cleansing of the files.
  • Used various Talend components such as tMap, tDB2Input, tDB2Output, tFileDelimited, tFileOutputDelimited, tUnique, tFlowToIterate, tIntervalMatch, tLogCatcher, tFlowMeterCatcher, tFileList, tAggregate, tSort, and tFilterRow to design complex flows.
  • Created Talend flows to perform data checks, standardization, matching, and clustering across multiple sources, perform individuation, householding, and deduping on the data, and create the golden record from among the records (see the survivorship sketch after this list).
  • Developed test cases to perform unit testing of the database scripts and the Talend and SSIS ETL flows and transformations, validated the developed code, and coordinated with business users in performing UAT.
  • Troubleshot data integration issues and bugs, analyzed reasons for failure, implemented optimal solutions, and revised procedures and documentation as needed.
  • Automated multiple jobs involved in monitoring, support, and troubleshooting for Talend jobs and scripts scheduled through the Talend Admin Center (TAC). Created triggers on the TAC server to schedule Talend jobs to run on the server.
  • Migrated the Talend flows and jobs from version 5.x to 6.x. Created backups and migrated all the jobs and the related functions, routines, and database objects over to the new environment.
  • Tuned and optimized the Talend flows to drastically reduce the overall run time, handle huge volumes of data, and absorb potential volume growth over time.
  • Developed a high-level data dictionary of database objects, data elements, and Talend ETL data mappings and transformations for reference purposes. Interacted with the business to clarify any doubts about the elements.
  • Converted existing SSRS reports and created new dashboard reports in Tableau for consumption by the various user groups, which they use to assess the trends and performance of marketing campaigns.
  • Created multiple interactive Tableau dashboards and data visualization reports covering responders' and customers' responses to the campaigns and analysis of products and product lines.
  • Generated Tableau dashboards with filters, parameters, and sets to handle views more efficiently. Created advanced chart types, visualizations, and complex calculations to manipulate the data. Connected to Tableau Server to publish dashboards to a central location for portal integration.
  • Created SQL scripts and complex stored procedures, and provided database support, data analysis, and scripts for data cleansing and scrubbing.
  • Analyzed the business requirements for the marketing campaign process and updated the requirement documents and specifications in accordance with the updated business needs and process.
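
The survivorship step referenced above can be sketched in windowed SQL (T-SQL; match_key, source_rank, and dbo.member_stg are hypothetical names standing in for the output of the Talend matching flows):

    -- Keep one "golden" row per match cluster: most trusted source, most recent update.
    WITH ranked AS (
        SELECT m.*,
               ROW_NUMBER() OVER (
                   PARTITION BY m.match_key                       -- cluster id from matching
                   ORDER BY m.source_rank ASC, m.last_updated DESC
               ) AS rn
        FROM dbo.member_stg m
    )
    SELECT *
    FROM ranked
    WHERE rn = 1;   -- the golden record for each individual/household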

Confidential, Tampa, Florida

Senior BI Developer/ Data Analyst

Environment: Talend Big Data Suite 5.x, Tableau 8.x, Oracle 11g, SQL, T-SQL, Toad, MS SQL Server, Cloudera 5.x, SAS, PL/SQL, MS Access, Excel, Macros, VBA, VB.NET, SharePoint, NoSQL, C, Autosys, Unix, Shell Script, J2EE

Responsibilities:

  • Involved in development of and enhancements to the key loans and securities lending application to improve the overall performance of the system and help onboard new clients onto the lending system.
  • Involved in various development and analytical initiatives related to project Fusion, which will merge the SecLending and OpenLend applications to provide a single portal for loans and securities lending operations.
  • Conducted a review of the database objects such as tables, views, etc. to determine the as-is status, identify scope for improvement in the overall design from a performance point of view, and provide recommendations.
  • Performed data profiling and data quality analysis using DataFlux on the existing data and on data coming in from the other businesses, in order to assess the overall standard of the data, and published the results on SharePoint for everyone's consumption.
  • Identified field/column-level data quality issues in the source systems to drive the data quality, data cleansing, and error checking in the mappings using DataFlux. Identified and documented the source-to-target mapping data lineage.
  • Held discussions with key stakeholders and users about the DataFlux data quality and profiling results, got feedback about specific data quality checks, and provided suggestions. Conducted data cleansing and scrubbing for further loads.
  • Created multiple Talend flows to connect the securities lending system to the CitiRisk Qualification system, which processes large amounts of receivables daily and performs credit risk analysis for the customer receivables given as collateral.
  • Created Talend flows to load and transform data from various sources such as flat files (delimited and positional), Oracle, XML, MS SQL Server, MS Access, and Excel files.
  • Created Talend flows to handle positional files that form a record by joining multiple records, and to format the data conditionally.
  • Used various Talend components such as tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tLogCatcher, tSetGlobalVar, tHDFSInput, tHDFSOutput, tFilterRow, tS3Put, tS3Get, tRedshiftInput, and tRedshiftOutput to create complex jobs.
  • Developed mappings to load the fact and dimension tables for incremental loads, and unit tested the mappings.
  • Developed test cases and use cases, and was involved extensively in testing NoSQL queries and business transformations, writing test cases, and validating the requirements against the developed code.
  • Migrated huge amounts of data from different databases (Netezza, Oracle, SQL Server) to Hadoop. Imported and exported data into HDFS and Hive using Sqoop.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs (see the HiveQL sketch after this list).
  • Monitored and maintained multiple automated data extraction jobs and daily Talend jobs, which receive high-volume feeds and files from various locations daily.
  • Coordinated and managed the development, testing, and support activities between the onshore and offshore teams and ensured proper communication to facilitate smooth operation and resolution of issues.
  • Created tables and a Talend framework to track various aspects of the jobs' execution cycle and automated them using a combination of scripts and a scheduler, on time-based or event-based schedules.
  • Involved in troubleshooting day-to-day issues and worked with the users and the team toward their resolution.
  • Created multiple design and enhancement documents for the changes and guided the SIT and UAT teams in testing the full functionalities and nuances of the changes.
  • Created Tableau reports that filter information by metrics such as loans outstanding, collateral by collateral type, top securities by earnings, top borrowers by loan balances, etc.
  • Worked on creating Tableau dashboards that provide snapshots of the critical portfolio data, which can be used for portfolio analysis and risk mitigation.
  • Prepared Tableau dashboards using calculations and parameters. Worked on the development of Tableau dashboard reports of the Key Performance Indicators for top management.
  • Developed and maintained various stored procedures, SQL scripts, and job scripts to accomplish data processing, facilitate complex transactional and reporting functionalities, and format the data conditionally.
  • Identified bottlenecks in the jobs, created appropriate indexes wherever needed, reorganized or recreated them where necessary, and resolved data contention issues.
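
The Hive work mentioned above follows a common pattern: Sqoop lands the relational extract in HDFS, an external Hive table is declared over it, and queries compile to MapReduce jobs. A HiveQL sketch (table name, columns, and HDFS path are hypothetical):

    -- External table over the Sqoop-imported extract.
    CREATE EXTERNAL TABLE IF NOT EXISTS loans_stg (
        loan_id     STRING,
        borrower_id STRING,
        balance     DECIMAL(18,2),
        as_of_date  STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/seclending/loans_stg';

    -- Aggregation that Hive runs as a MapReduce job.
    SELECT borrower_id, SUM(balance) AS total_outstanding
    FROM loans_stg
    GROUP BY borrower_id;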

Confidential

Senior Data Analyst/Business Analyst

Environment: Oracle 11g, DB2, PL/SQL, T-SQL, Mainframe, Informatica 9.x, Informatica IDQ, Toad, Oracle Customer Data Hub, Sybase 15.x, J2EE, WinSCP, PuTTY, ER Studio, Unix Shell Script, Linux

Responsibilities:

  • Development and support of the key data integration and migration initiative for the conversion, integration, and migration of data from different legacy systems into the central database system, from where it is consumed for further business decisions.
  • Interacted with legacy business users to understand business process flows and business logic, and to assess the major data objects, data volume, and level of effort required for data conversion and migration.
  • Worked with business users/owners to gather data migration requirements, design data mapping documents, and define business rules for cleansing and transformation requirements.
  • Documented and updated the business requirements in close coordination with data stewards and key stakeholders regarding the ongoing approach and the mapping of objects and data elements across the systems.
  • Conducted in-depth CRUD analysis of the data from multiple systems to determine the high-impact database objects and tables, the data flow, and potential areas of improvement.
  • Conducted in-depth data profiling and data quality assessment of the data in the legacy systems to determine the current level of data accuracy, conformance to standards, etc.
  • Analyzed and compared the data across different sectors to check the consistency, accuracy, and relevance of the existing data. Profiled the reference, waybill, and shipping data and documented the discrepancies observed in the data.
  • Wrote multiple SQL and PL/SQL scripts, stored procedures, etc. to process and standardize the data, and ran checks across different databases to verify referential integrity and consistency of the data (see the orphan-check sketch after this list).
  • Held discussions with data architects and business stakeholders about the data quality results, the data standards, and the kinds of business rules needed to ensure an efficient data quality standard in future.
  • Developed a framework to extract and dump the data from various legacy sources into the staging area, and created scripts and setup to extract, parse, cleanse, and massage the data to standardize it.
  • Created the data structures and staging area for dumping the legacy data and inter-linking all the segment data for in-depth analysis and a closer look.
  • Created extraction files from various data sources, dumped them into a staging area, and used scripts to parse the data out of the files and load it into the staging tables.
  • Developed scripts, matching rules, and transformations to compare data across systems, harmonize and massage the data for cleansing, and identify and eliminate duplicates.
  • Conducted migration and integration of the data from multiple data sources into the target tables using the extract, transform, and load packages.
  • Conducted post-migration checks and acceptance testing to validate the quality, accuracy, and consistency of the data in the target database.
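
A minimal example of the cross-system referential integrity checks described above (generic SQL; waybill_stg and shipment_stg are hypothetical staging tables, not the actual legacy schema):

    -- Waybills whose parent shipment never arrived in staging (orphans).
    SELECT w.waybill_id, w.shipment_id
    FROM waybill_stg w
    LEFT JOIN shipment_stg s ON s.shipment_id = w.shipment_id
    WHERE s.shipment_id IS NULL;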

Confidential

Senior BI Analyst/ Digital Marketing Analyst

Environment: SSIS, SSRS, Tableau, SSAS, DataFlux, GoodData, DQS, MS SQL Server 2008 R2/2012, T-SQL, Oracle 11g, Crystal Reports, MS Access, Excel, VBA, SharePoint, Business Objects, Amazon AWS, XML, Erwin

Responsibilities:

  • Development, support, and maintenance of key Business Intelligence and data reporting solutions using digital marketing and demographic data provided by the clients, enabling them to make better business decisions.
  • Implementation and delivery of business solutions to develop and deploy ETL, analytical, reporting, and scorecard/dashboard solutions in the area of digital, social media, and competitive data from Google Analytics and Adobe.
  • Involved in creation/review of functional requirement specifications and supporting documents for business systems, interacting with business analysts, the client team, the development team, etc.
  • Performed data quality analysis and data profiling of the various data sources to determine adherence to standards. Performed data scrubbing, cleansing, and standardization to prep the data prior to the load.
  • Interacted with the development teams and business analysts to create and modify the data model and LDM (Logical Data Model) for publishing multiple dashboard reports on web and cloud platforms in accordance with client expectations.
  • Developed complex, intuitive dashboard reports for the display of key business data on Tableau and GoodData.
  • Developed reports and dashboards for the CMO and Director of Marketing (using Tableau, Excel, and SQL) that measured the effectiveness of inbound marketing campaigns (DRTV / PITV / Radio / Warm Transfer / Email / SEO / SEM / Social).
  • Created reports measuring key performance indicators and communicated the findings throughout the company.
  • Communicated with key stakeholders, Client Sales Managers (CSMs), and end users regarding business requirements and specifications for the latest dashboard reports and BI solutions.
  • Created and modified the fact and dimension tables in the star schema in accordance with the BI data analytics and reporting solutions (see the schema sketch after this list).
  • Coordinated the design and implementation of key dashboard reporting platforms based on social media, digital media, and competitive data.
  • Coordinated the onboarding of new clients onto the dashboard reporting platforms and managed their expectations.
  • Involved in the complete SSIS life cycle: creating SSIS packages and building, deploying, and executing the packages in both environments (development and production).
  • Developed and implemented SSIS packages to extract and load data from various heterogeneous data sources such as Excel, CSV, Oracle, MS Access, SQL Server, flat files, etc.
  • Developed and deployed SSIS packages with tasks and transformations such as Execute SQL, Execute Package, Conditional Split, Script Component, Multicast, Merge, and Lookup to load the data into the destination.
  • Worked on performance tuning and optimization of the ETL packages at the control flow and data flow level, along with making proper use of transactions and checkpoints.
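
A minimal sketch of the star-schema shape used for this kind of campaign reporting (T-SQL; the table and column names are illustrative only):

    -- One row per campaign and channel.
    CREATE TABLE dbo.dim_campaign (
        campaign_key  INT IDENTITY(1,1) PRIMARY KEY,
        campaign_name VARCHAR(100) NOT NULL,
        channel       VARCHAR(30)  NOT NULL   -- DRTV, PITV, Radio, Email, SEO, SEM, Social
    );

    -- Grain: responses per campaign per day.
    CREATE TABLE dbo.fact_response (
        campaign_key  INT  NOT NULL REFERENCES dbo.dim_campaign (campaign_key),
        response_date DATE NOT NULL,
        responses     INT  NOT NULL,
        conversions   INT  NOT NULL
    );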

Confidential

Senior Data Analyst/Developer

Environment: Oracle 10g/11g, Sybase 12.x, SQL, PL/SQL, T-SQL, DB Artisan, SSIS, Rapid SQL, Sun Solaris, Oracle Mantas, Crystal Reports 11.0, SQL Server 2008, J2EE, Trillium, Advanced Query Tool, SecureCRT, Erwin, Unix Shell Script

Responsibilities:

  • Development, support, and maintenance of the key screening and monitoring applications in the Anti-Money Laundering unit at Confidential, used for sanction screening and transaction monitoring in accordance with the USA PATRIOT Act and Bank Secrecy Act (BSA).
  • Involved in development and updates of the functional specification and technical specification documents in accordance with the latest business requirements and enhancements, with regular discussions with the key stakeholders.
  • Involved in analysis and audit of the data and documents from various units to ensure application integrity, accuracy, consistency, and completeness from an Anti-Money Laundering (AML) guidelines perspective.
  • Performed data profiling and data quality checks on the data for multiple units to ensure it met the specified business rules and the standards set within the organization.
  • Documented and summarized the findings in reports specifying the corrective actions. Worked on the detailed documentation of the KYC and customer due diligence (CDD) guidelines.
  • Documented procedures to maintain and update multiple lists of designated people (SDN, etc.) for the AML and compliance unit. Assisted in optimization of the matching process.
  • Worked on automation and updates of the lists of suspect individuals and organizations provided by the Office of Foreign Assets Control (OFAC), the EU, World-Check, etc.
  • Conducted in-depth analysis of the matching logic and took steps to optimize and tune the process, determining the optimum threshold level to minimize false positives (see the scoring sketch after this list).
  • Developed and maintained complex SQL scripts, stored procedures, triggers, functions, indexes, and other database objects to validate and load the data into the application.
  • Performed performance tuning and optimization of queries, stored procedures, and scripts, with in-depth analysis and tuning of indexes, to improve the overall speed and efficiency of the system.
  • Created scripts and processes for dumping the data into staging tables and for cleansing and massaging the data prior to the next stage.
  • Developed and updated the ETL packages, tasks, workflows, etc., and developed multiple transformations to extract, transform, and load data from various data sources such as flat files, Excel, CSV, MS Access, etc.
  • Monitored and maintained the data extraction jobs, data feeds, and overall workflows in the ETL process, and troubleshot issues related to them.
  • Resolved AML-related issues to ensure adoption of standards and guidelines in the organization. Resolved day-to-day issues and worked with the users and the testing team on resolution of issues and fraud-incident-related tickets.
  • Developed a data quality framework for monitoring data quality and accuracy, with built-in alerts for data discrepancies. Performed in-depth analysis of data fields and performed data cleansing, standardization, and enrichment on the data.
  • Worked on enhancements to the NSCC Liquidity projects, the purpose of which is to enhance the current liquidity profile and make liquidation adjustments to VaR (Value at Risk) and other coverage components.
  • Conducted meetings with the users' points of contact to collect requirements, analyze and elaborate on them, and expand them into a tentative design document.
  • Conducted regular reviews of the functional and technical specifications to reflect the latest changes and enhancements.
  • Managed the development team activity onshore as well as offshore in India, including daily monitoring, status reporting, technical guidance, and mentorship.
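
To illustrate the threshold idea only (not the production matcher, which is proprietary to the screening product), a toy T-SQL version using the built-in DIFFERENCE() score as a stand-in fuzzy measure; all table and column names are hypothetical:

    -- DIFFERENCE() returns 0 (no similarity) to 4 (strong similarity).
    -- Raising the cutoff from 3 to 4 trades recall for fewer false positives.
    SELECT t.txn_id,
           s.sdn_name,
           DIFFERENCE(t.counterparty_name, s.sdn_name) AS match_score
    FROM dbo.txn_stg  t
    JOIN dbo.sdn_list s
      ON DIFFERENCE(t.counterparty_name, s.sdn_name) >= 3;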

Confidential, Tampa, Florida

Senior Data Analyst/Business Analyst

Environment: Oracle 10g/11g, MS SQL Server 2005/2008, Toad, SQL, PL/SQL, T-SQL, DTS, SSIS, SSRS, DataFlux, SAS, Business Objects, Mainframe, Sybase 12.x, SQL Navigator, SQL Profiler, Mantas, Actimize, MS Access 2007, Excel Macros, VBA, Crystal Reports 11.0, Windows Vista, Unix, Shell Scripts, Sun Solaris, J2EE

Responsibilities:

  • Support and development of the key anti-money laundering applications for the sanction screening and transaction monitoring process, ensuring compliance with the USA PATRIOT Act and Bank Secrecy Act (BSA).
  • Support and maintenance of key Anti-Money Laundering (AML) applications, which manage data related to US sanctions and Specially Designated Nationals (SDNs) from OFAC (Office of Foreign Assets Control), USA PATRIOT Act Section 314(a), and the transaction screening process.
  • Provided Business as Usual (BAU) and production support for the NESS application and performed data analysis. Provided training to users as well as developers to bring them up to speed on different aspects of AML and compliance.
  • Designed the test cases and test scripts and coordinated testing activities for various NESS application releases. Coordinated on a regular basis with the offshore team to ensure smooth execution on the testing, development, and training fronts.
  • Documented procedures for creation, analysis, and maintenance of various global, local, and regional lists.
  • Involved in development and maintenance of hot lists and SDN lists as requested by OFAC (Office of Foreign Assets Control).
  • Developed, maintained, and replicated various global, local, and regional lists of sanctioned people and entities, which help Citi regional offices around the world maintain strict standards in terms of financial transactions.
  • Conducted research, assisted, and provided information to support suspicious activity reports (SARs), and made recommendations. Worked on various aspects of AML, fraud prevention, KYC, and brokerage and trading compliance.
  • Defined the review procedures for disposition of cases, identified issues with data quality and accuracy, and rectified them.
  • Developed and tuned the optimization script in the database to reduce the rate of false-positive hits generated by the system, streamline the overall screening and filtering process, and improve overall performance.
  • Developed and tuned the database script for performing the hit-rate analysis of the list entries against the transaction files (see the sketch after this list).
  • Guided the automation of the gap analysis process and the preparation of lists using MS Excel, macros, MS Access, SQL, and SAS.
  • Obtained a wide selection of client documentation from internal and external data sources in order to extract key information and validate clients' identity and reputation so that they meet KYC checks.
  • Performed in-depth data quality analysis and profiling, using DataFlux, of the input data of the business units and new businesses within Citi Group scheduled to come on board the AML applications, to assess issues with data quality.
  • Designed the framework for the DataFlux Architect and DataFlux Profiler jobs for the extraction of data from various data sources: flat files, MS SQL Server, Oracle, Excel, and COBOL copybook files.
  • Developed and maintained a profiling report application in Crystal Reports in accordance with the GAML guidelines, which provides the details, frequency distribution, and pattern analysis for all the data elements for various business sectors in Citi such as CitiFinancial, Citi Auto, and FDR.
  • Discussed the data quality and profiling results with the key stakeholders and GAML, and recommended measures to rectify, fix, and cleanse the data.
  • Developed a reporting solution for data profiling using DataFlux, MS SQL Server, and reporting tools.
  • Enhanced the current applications to add complex functionality, with performance tuning and optimization of the application for better speed and matching accuracy, in view of the addition of new business units and clients.
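
The hit-rate analysis mentioned above reduces to counting hits per list entry against a day's transaction volume; a T-SQL sketch with hypothetical table names:

    -- Hits per list entry as a share of the daily transaction file.
    SELECT l.entry_id,
           l.entry_name,
           COUNT(h.txn_id)                                  AS hits,
           100.0 * COUNT(h.txn_id) / NULLIF(t.total_txn, 0) AS hit_rate_pct
    FROM dbo.list_entry l
    LEFT JOIN dbo.screening_hit h ON h.entry_id = l.entry_id
    CROSS JOIN (SELECT COUNT(*) AS total_txn FROM dbo.daily_txn) t
    GROUP BY l.entry_id, l.entry_name, t.total_txn
    ORDER BY hits DESC;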

Confidential, Orlando, Florida

Senior ETL Developer/Data Analyst

Environment: MS SQL Server 2000/2005, Power Designer, DTS, SSIS, SSRS, Windows XP, Trillium, Rapid SQL, SQL Profiler, SSAS, Crystal Reports, Business Objects, VBA, TFS, Excel VBA, Excel Macros, MS Access, Windows NT Server

Responsibilities:

  • Involved in the development and maintenance of Business Intelligence solutions for data analysis and reporting of the demographic and business data of regional banks.
  • Identified the data sources and created source-to-target mapping documents to be used by the packages in SQL Server Integration Services (SSIS).
  • Designed, developed, and tested SSIS packages and data flow transformations for the extraction, transformation, and loading (ETL) of data from various data sources such as flat files, SQL Server, Access, CSVs, etc.
  • Created and developed data flows and workflows for staging, manipulating, and loading the data into the target data warehouse.
  • Developed SSIS packages using Lookup transformations, Merge Joins, Fuzzy Lookups, and Derived Columns with multiple Data Flow tasks.
  • Used different Control Flow tasks and Data Flow tasks for creating SSIS packages. Used different types of transformations for data conversion, sorting, and data cleansing from different sources into company formats.
  • Performed data quality analysis and profiling of the input data. Designed and developed the data cleansing and standardization process for the customer demographic and marketing campaign data.
  • Designed a large number of reports using table filters, single-value parameters, multi-value parameters, dependent parameters, and cascading parameters.
  • Developed reports using complex T-SQL queries, user-defined functions, stored procedures, and views (see the sketch after this list).
  • Developed cubes using SQL Server Analysis Services (SSAS); experienced in developing and extending OLAP cubes, dimensions, and data source views.
  • Created various SQL and T-SQL stored procedures, triggers, user-defined functions, and views for both online and batch request handling, covering business logic and functionality of various modules.
  • Designed and modified the logical and physical data models in accordance with the updated business requirements. Created fact and dimension tables in the star schema model and cubes for drill-down and data analysis.
  • Analyzed and reviewed the business requirements and the approach for the proposed dashboard reporting and BI solutions with business analysts and key stakeholders.
  • Created various ad-hoc SQL queries for customer reports, executive management reports, trend analysis reports, and sub-reports.
  • Debugged and resolved execution errors using data logs (trace, statistics, error) and by examining the source and target data.
  • Tested and debugged stored procedures and scripts, and tuned the ETL process. Monitored and maintained the data extraction processes and scripts.
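
A representative shape for the parameter-driven report queries behind such reports (T-SQL sketch; the procedure, tables, and parameters are illustrative, not the actual schema):

    CREATE PROCEDURE dbo.usp_BranchActivityReport
        @RegionId INT,
        @FromDate DATE,
        @ToDate   DATE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- The reporting tool binds its report parameters to these procedure parameters.
        SELECT b.branch_name,
               SUM(f.txn_count) AS txn_count
        FROM dbo.fact_activity f
        JOIN dbo.dim_branch    b ON b.branch_key = f.branch_key
        WHERE b.region_id     = @RegionId
          AND f.activity_date BETWEEN @FromDate AND @ToDate
        GROUP BY b.branch_name
        ORDER BY b.branch_name;
    END;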

Confidential

Senior Database Developer/DBA

Environment: Sybase ASE 12.5/15.0, T-SQL, Sybase IQ, MS SQL Server 2000/2005, Power Designer, Sun Solaris, Unix, DB Artisan, Rapid SQL, Business Objects, Visual Basic, JSP, Windows NT Server, JavaScript

Responsibilities:

  • Development of procedures and scripts according to the Visa K and TPS10 business specifications.
  • Coding and maintenance of various Stored Procedures, triggers, tables, indexes etc. and troubleshooting database issues.
  • Coordinated the setup and onboarding of new properties, including hotels (Marriott, Hilton) and cruise ships (Royal Caribbean, Princess, etc.), in the Bankswitch system.
  • Involved in analysis and bug fixing for the Bankswitch application for GCS, which interacts with the client and the bank software to provide authorizations and settlements.
  • Developed and maintained the daily/weekly tasks that archive old data to the Sybase IQ data warehouse (see the sketch after this list). Developed scripts in Sybase IQ to run periodic checks on the data.
  • Performed performance tuning and optimization of stored procedures, queries, and scripts for better performance.
  • Monitored and maintained data extraction processes and scripts.
  • Made changes to various reports to make them more user-friendly and efficient.
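
Archival tasks of this kind typically follow a copy-then-purge pattern; a Sybase T-SQL sketch (the table names and the 90-day retention window are assumptions, and the IQ history table is assumed to be reachable from ASE, for example through a proxy table):

    -- Copy settled rows older than the retention window into the IQ warehouse ...
    INSERT INTO iq_hist..settlement_archive
    SELECT * FROM bankswitch..settlement
    WHERE settle_date < dateadd(day, -90, getdate())

    -- ... then purge them from the operational table.
    DELETE FROM bankswitch..settlement
    WHERE settle_date < dateadd(day, -90, getdate())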

Confidential, Boston, Massachusetts

Senior Database Developer/Analyst

Environment: Sybase ASE 12.5, T-SQL, Power Designer, DTS, Sun Solaris, Unix, DB Artisan, Rapid SQL, Informatica PowerCenter 7.1, Perl, Autosys, MS Project, DB2, Oracle 9i, Toad

Responsibilities:

  • Created and maintained Stored Procedures, Triggers, Tables, Indexes, Rules, etc. for the FMRCO application.
  • Wrote and modified stored procedures for populating data and implemented procedures for displaying reports.
  • Created and updated various simple and complex transformations (Aggregator, Expression, Lookup, Joiner, Sequence Generator, Sorter, Stored Procedure, Update Strategy, etc.) as part of the mappings between source and target.
  • Created sessions and executed them to load the data from the source system using Informatica Server Manager.
  • Performed performance tuning of sources, targets, mappings, and SQL queries in the transformations.
  • Involved in production support activities, including monitoring loads, resolving exceptions, and correcting data problems.
  • Conducted performance tuning on stored procedures as well as mappings and sessions to identify bottlenecks.

Confidential

Senior Software Developer

Environment: Sybase ASE 12.5, T-SQL, Power Designer, Sun Solaris, Unix, JSP, JavaScript, XML, Crystal Reports 9.0, Crystal Enterprise 9.0, XUL, HTML, DB Artisan, SSH FTP Client

Responsibilities:

  • Worked on adding multiple components and functionalities to the Netscape Browser 8.0 using JavaScript, XUL (XML User Interface Language), and Flash.
  • Responsible for design and development of various reusable components, reports, and back-end stored procedures for data loads.
  • Wrote and modified stored procedures and scripts with new business rules for loading the data from different processors.
  • Performed performance tuning and optimization of existing T-SQL stored procedures, queries, and scripts.
  • Developed and integrated data reports, migrating older reporting systems to Crystal Reports.

Confidential

Senior Database Developer/Analyst

Environment: Sybase ASE 12.5, MS SQL Server 2000, T-SQL, Power Designer, Sun Solaris, Unix, CodeWright 7.5, Visual SourceSafe, DTS, Windows 2000/NT/XP, Perl, VB.NET, Siebel, SCCS

Responsibilities:

  • Designed, implemented, and monitored various nightly loaders that trigger various jobs and stored procedures.
  • Developed, tested, and enhanced the Mutual Fund Client Processing System in the Sybase ASE database.
  • Involved in analysis, bug fixing, and enhancements for the Reflex database systems, which interact with the .NET client as well as interface with Siebel CRM and Campaign.
  • Performed performance tuning and optimization of existing stored procedures, queries, and scripts.
  • Resolved issues related to production, data fixes, and tuning.
  • Modified and worked on Perl data extraction scripts, which dump the data into the database.
