Associate Architect Resume

DE

PROFESSIONAL SUMMARY:

  • 8 years of IT experience spanning data science, statistics, machine learning, data analysis, data extraction, data modeling, data validation, data warehouse environments, software development, and project management.
  • Involved in all the stages of Software Development Life Cycle (SDLC).
  • Worked as both a data analyst and a data scientist supporting the retail banking arm of the client Confidential. Used Unix, Python, SQL, and SAS to query, manipulate, analyze, and model data stored in platforms including Teradata and Oracle.
  • Extensive experience with object-oriented programming (OOP) concepts using Python on Linux.
  • Hands-on experience with CSS, HTML, and JavaScript.
  • Experience with Python modules such as pandas, regular expressions, NumPy, SciPy, matplotlib/pyplot, scikit-learn, and logging.
  • Strong expertise in using Tableau software as applied to BI data analytics, reporting and dashboard projects.
  • Experience with Teradata utilities such as BTEQ, FastLoad (FLoad), and MultiLoad (MLoad).
  • Substantial experience in scripting languages: Python, SAS, R, and Pig and query languages: SQL and Hive.
  • Used SQL extensively, including joins, subqueries, set operations, and advanced OLAP functions.
  • Strong working knowledge of machine learning algorithms like Naive Bayes Classification, KNN Classification, Logistic Regression, Linear Regression, Neural Networks, Decision Trees, and Association analysis.
  • Programming experience in SAS/BASE, SAS/SQL, SAS/ODS, SAS/ACCESS, and SAS/STAT.
  • Experience with SAS procedures such as PROC FREQ, PROC TABULATE, PROC REPORT, PROC SUMMARY, PROC MEANS, PROC FORMAT, PROC TRANSPOSE, and PROC SQL.
  • Strong understanding of the principles of Data Warehouse using Fact Tables, Dimension Tables, Star and Snowflake schema modeling.
  • Extensive knowledge on data warehousing applications, directly responsible for the Extraction, Transformation and Loading of data from multiple sources into Data Warehouse.
  • Strong understanding of development processes and agile methodologies and ability to multitask within an environment of rapidly changing priorities.
  • Experience with R packages such as RODBC, outliers, missForest, ggplot2, caret, and rminer.
  • Excellent communication skills, a very good team player. Ability to work independently under minimum supervision, in a fast-paced environment and under strict deadlines.
  • Act as overall technical authority for the project.
  • Handle raw data using current technologies and techniques, and perform the necessary analysis based on the client’s business requirements.
  • Provide value-add analysis to business through use of visualization software to guide analysis, drawing implications from analysis, and synthesizing into clear communications.
  • Gather and process raw data at scale (including writing Python scripts, web scraping, calling APIs, and writing SQL queries).
  • Process unstructured data into a form suitable for analysis and then do the analysis.
  • Work with business and cross-functional teams to thoroughly document reporting processes and systems.
  • Perform strategic data analysis and research to support business needs.
  • Support business decisions with ad hoc analysis as needed.
  • Identify areas for improvement, communicating action plans to supervisor or client.
  • Automate business-as-usual (BAU) and manual tasks by building tools in Python/Pig.
  • Handle client communication regarding requirements, design, etc.
  • Gain understanding of business needs and necessary analysis where appropriate through internal discussion.
  • Review the developed code and make sure it adheres to the design, standards and guidelines of the client.
  • Support the client on estimations, and manage expectations.
  • Participate in all the client meetings related to analytics projects.
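The SQL techniques cited above (joins, subqueries, aggregation) can be illustrated with Python's built-in sqlite3 module; the tables, names, and amounts below are invented purely for the example:

```python
import sqlite3

# In-memory database with two small hypothetical tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customers (id INTEGER, name TEXT);
CREATE TABLE orders (cust_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'Ann'), (2, 'Bob');
INSERT INTO orders VALUES (1, 30.0), (1, 20.0), (2, 5.0);
""")

# Join plus a subquery: customers whose total spend exceeds the
# average order amount.
rows = con.execute("""
SELECT c.name, SUM(o.amount) AS total
FROM customers c
JOIN orders o ON o.cust_id = c.id
GROUP BY c.name
HAVING total > (SELECT AVG(amount) FROM orders)
""").fetchall()
```

The same join and subquery patterns carry over to Teradata or Oracle SQL with minor dialect changes.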

CORE COMPETENCIES:

  • Data Science
  • Statistics
  • Artificial Intelligence
  • Supervised and Unsupervised Learning
  • Campaign Management
  • Team management and leadership
  • Business Intelligence
  • Analytics & Modeling
  • Big Data
  • Machine Learning
  • Web application development

TECHNOLOGIES:

  • Python
  • SAS
  • SQL
  • Hive
  • Hadoop
  • Greenplum
  • HTML
  • Advanced Excel
  • R
  • Tableau
  • QlikView
  • Teradata
  • Oracle
  • GitHub
  • MS Azure ML
  • Pig

PROFESSIONAL EXPERIENCE:

Confidential, DE

Associate Architect

Technology: Python, SAS, SQL, R, Teradata, Greenplum, Oracle, HTML

Responsibilities:

  • Developed a Python tool that logs into the Confidential internal analytics website, parses the HTML, extracts the required data from various web pages, cleans the data, applies statistics, creates charts and tables, and emails the report to business partners every morning.
  • Created various SAS and SQL programs to extract the Chase Freedom and Sapphire credit card customer data from Teradata and Greenplum, validated data, applied exclusions and generated text files as final outputs which are further used for marketing and servicing campaigns.
  • Analyzed the credit card transaction data of US customers with over 90 variables and built a classification model to identify customers likely to upgrade to a platinum card within the next six months, generating additional income for the bank.
  • Analyzed the behavioral data of Chase APAC customers in e-banking and recommended that the bank’s web team refine the website content, improve browser compatibility, and enhance visual presentation to increase customer engagement. As a result, the bank’s lead conversion rate improved by over 40%.
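The parse-and-summarize step of a reporting tool like the one described in the first bullet can be sketched as follows. The HTML layout, labels, and values are invented for illustration, and the real tool's authenticated login and email delivery are omitted:

```python
import re
import statistics

# Stand-in for a page fetched from the (hypothetical) analytics site.
SAMPLE_HTML = """
<table>
  <tr><td>Campaign A</td><td>1,240</td></tr>
  <tr><td>Campaign B</td><td>980</td></tr>
</table>
"""

def parse_metrics(html):
    """Extract (label, value) pairs from simple two-column HTML rows."""
    rows = re.findall(r"<tr><td>(.*?)</td><td>([\d,]+)</td></tr>", html)
    return {label: int(value.replace(",", "")) for label, value in rows}

metrics = parse_metrics(SAMPLE_HTML)
avg = statistics.mean(metrics.values())   # statistic for the morning report
```

In practice the cleaned metrics would feed a chart library and an SMTP client to produce and send the daily report.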

Confidential, RI

Analytics Consultant

Technology: SAS, SQL, R, Python, Teradata, Tableau, QlikView

Responsibilities:

  • Defined customer engagement and adoption metrics for the CVS mobile app to understand the characteristics of highly engaged, non-engaged, highly adopted, and non-adopted customers. Also assessed the Confidential performance of various tools in the mobile application during 2017.
  • Developed SQL and SAS scripts to extract the engagement and adoption metrics data from Teradata for 2016 and 2017.
  • Defined segments for each of the engagement and adoption metrics to understand the segment wise performance of mobile app during 2016 and 2017.
  • Developed SQL and SAS scripts to create segments for each of the engagement and adoption metrics.
  • Pulled the engagement and adoption data into Tableau and created Tableau reports and dashboards to compare and visualize the 2016 and 2017 engagement and adoption metrics data for each segment.
  • Published the Tableau reports and dashboards to the Tableau server and presented the final recommendations to the higher management.

Confidential, VA

Data Analyst

Technology: Python, SAS, SQL, R, Teradata, GitHub, HTML

Responsibilities:

  • Developed a Python API that creates daily transactional, collections, and business reports for front-end users (of the Confidential home-grown website DOTIE) by interacting with the Teradata database using SQL scripts.
  • Developed a Python parsing program that parses a SQL query of any complexity and outputs the column names, column aliases, table names, table aliases, and databases queried.
  • Developed a Python pandas script that extracts the daily campaign data for retail customers from history tables, creates a Tableau dashboard, and triggers an automatic email of the dashboard to the business unit.
  • Developed a Python script to load large volumes of records from text files into Teradata tables in seconds.
  • Developed all Python programs in a UNIX environment using the vi editor.
  • Committed all Python programs from UNIX to GitHub development branches for testing.
  • Debugged issues tracked in JIRA (Agile) and managed requirements and tasks using JIRA.
  • Created a continuous build process using Jenkins as the continuous integration tool.
  • Managed tools such as Jenkins and JIRA, and performed maintenance and troubleshooting of build/deployment systems.
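The core of the SQL-parsing program described above can be sketched with a regular expression that pulls table names and aliases out of FROM/JOIN clauses. This is a simplification: a production parser must also handle subqueries, quoting, and comments. The query and schema names here are invented:

```python
import re

def tables_in(sql):
    """Return (table, alias) pairs referenced after FROM/JOIN.
    Sketch only: keywords following an unaliased table would be
    misread as aliases, so real input needs a full SQL parser."""
    pattern = r"\b(?:FROM|JOIN)\s+([\w.]+)(?:\s+(?:AS\s+)?(\w+))?"
    results = []
    for table, alias in re.findall(pattern, sql, flags=re.IGNORECASE):
        results.append((table, alias or None))
    return results

query = """
SELECT c.name, SUM(t.amount)
FROM bankdb.customers AS c
JOIN bankdb.transactions t ON t.cust_id = c.id
GROUP BY c.name
"""
```

Splitting the qualified names on `.` then yields the databases queried alongside the table names.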

Confidential, MI

Business Co-op

Technology: Python, SAS, SQL, R, Teradata, MS-Access, Advanced Excel, Tableau

Responsibilities:

  • Analyzed the 2014 energy usage (time-series data) of southeast Michigan households and businesses and built a classification model in R to predict customers involved in meter tampering.
  • Analyzed the behavioral and energy consumption data of the Confidential customers from Michigan Upper Peninsula using SAS, Tableau and came up with a competitive pricing strategy to reduce the customer churn.
  • Combined various data sets from different sources into database structure, dealt with missing values and wrong input values in Python.
  • Performed data query, extraction, compilation, and reporting tasks using SQL, Teradata, MS-Access, MS-Excel.
  • Used Excel sheets, flat files, and CSV files to generate Tableau ad hoc reports.
  • Used Tableau features to create requests, filters, charts, and interactive dashboards with page and dashboard prompts.
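The dataset-combining and missing/invalid-value handling described above can be sketched with pandas; the frames, columns, and values below are hypothetical:

```python
import pandas as pd

# Hypothetical samples standing in for the two source systems.
usage = pd.DataFrame({"cust_id": [1, 2, 3],
                      "kwh": [320.0, None, 410.0]})
accounts = pd.DataFrame({"cust_id": [1, 2, 3],
                         "region": ["SE-MI", "SE-MI", None]})

# Combine the sources on the shared key.
combined = usage.merge(accounts, on="cust_id", how="left")

# Impute numeric gaps with the median; flag missing categoricals.
combined["kwh"] = combined["kwh"].fillna(combined["kwh"].median())
combined["region"] = combined["region"].fillna("UNKNOWN")
```

The cleaned frame can then be written out as a CSV for the Tableau ad hoc reports mentioned above.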

Confidential, IL

Client Development Intern

Technology: Python, SAS, SQL, R, Teradata, Advanced Excel, DB2, Tableau

Responsibilities:

  • Extracted and analyzed Burger King’s retail point-of-sale data from Decision Key (the Confidential big data store) using SAS and presented a story on how Burger King had been losing a segment of loyal customers during breakfast hours over the previous six months.
  • Created a marketing mix model in Python to understand the impact of various marketing channels on the sales of US Nike footwear.
  • Performed complex statistical analysis using PROC MEANS, PROC FREQ, PROC UNIVARIATE, PROC REG, and PROC ANOVA.
  • Analyzed the social media campaign data of Huggies Diapers to measure campaign effectiveness via metrics such as awareness, engagement, and share of voice.
  • Presented a business case to NPD higher management showcasing the value of location data that would benefit NPD in providing more accurate business solutions to its customers and gain competitive advantage in the market.
  • Connected SQL Server to R using the RODBC package, extracted and transformed the data, imputed missing values, and built classification models using the R package caret.
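A marketing mix model of the kind described above is, at its core, a regression of sales on channel spend. A minimal sketch with NumPy follows; the spend and sales figures are fabricated (noise-free so the coefficients recover exactly):

```python
import numpy as np

# Hypothetical weekly channel spend and resulting sales.
tv      = np.array([10.0, 12.0, 8.0, 15.0, 11.0])
digital = np.array([5.0, 6.0, 4.0, 7.0, 5.5])
sales   = 2.0 * tv + 3.0 * digital + 50.0   # constructed for illustration

# Ordinary least squares: design matrix with an intercept column.
X = np.column_stack([tv, digital, np.ones_like(tv)])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
# coef ~ [2.0, 3.0, 50.0]: per-unit lift of each channel plus baseline.
```

Real marketing mix models add noise handling, adstock/carryover transforms, and diminishing-returns curves on top of this regression core.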

Confidential, MI

Project Manager

Technology: Python, SAS, SQL, R, Teradata, Advanced Excel

Responsibilities:

  • Collaborated with a team of three students to develop a model for measuring customer loyalty and value for a major store in Michigan using RFM technique and clustering algorithms in SAS.
  • Worked in a group of three to analyze the demographic and behavioral data of a major firm’s potential customers in R and create a marketing campaign to acquire them.
  • Enhanced classification algorithms performance through input data transformation, feature selection and dimensionality reduction techniques.
  • Developed canned and Ad hoc reports with complex calculations and custom SQL.
  • Defined KPIs/visualizations based on sample data and user preferences.
  • Explored machine learning techniques and econometric methods to generate evidence-based business insights.
  • Conducted marketing campaign analysis in R using A/B testing methodology to determine the impact of campaigns on conversion rates, ROI, and sales.
  • Connected Teradata to Python using the library pypyodbc, extracted the data and imputed missing values in scikit-learn.
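The RFM (recency, frequency, monetary) scoring behind the customer-loyalty model above can be sketched in Python; the customers, dates, and amounts are invented:

```python
from datetime import date

# Hypothetical purchase history: customer -> list of (date, amount).
purchases = {
    "A": [(date(2024, 6, 1), 120.0), (date(2024, 6, 20), 80.0)],
    "B": [(date(2024, 1, 5), 40.0)],
}
today = date(2024, 7, 1)

def rfm(history):
    """Recency (days since last purchase), frequency, monetary total."""
    recency = (today - max(d for d, _ in history)).days
    frequency = len(history)
    monetary = sum(a for _, a in history)
    return recency, frequency, monetary

scores = {cust: rfm(h) for cust, h in purchases.items()}
```

The resulting (R, F, M) triples are typically binned into quantile scores and then fed to a clustering algorithm to segment customers by loyalty and value.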

Confidential

Associate Consultant

Technology: Python, SAS, SQL, R, Teradata, DB2, Oracle, Hadoop, Pig

Responsibilities:

  • Worked in a team of three to build machine learning algorithms that predict APAC customers’ affinity towards product lines, using customer demographic, purchase, and browsing information to determine which product category had the most loyal customers.
  • Used Python with Pig to mine log files and identify outliers and anomalies.
  • Developed a one-time Python script to update new product information in the database.
  • Developed a Python program to retrieve the collections data of customers who had consistently defaulted over the past three years.
  • Unit tested and validated all Python programs and deployed many Python bug fixes into production.
  • Provided inputs to the development team for performing extraction, transformation, and load for data marts and data warehouses.
  • Created various data mapping repository documents as part of metadata services (EMR).
  • Provided business release support, validation plans, and data profiling for the RTM-enrolled merchant groups, involving big data.
  • Extracted data sets from server using PROC IMPORT and created datasets in SAS libraries.
  • Used the Import and Export facilities in SAS to exchange data between SAS and Microsoft Office environments (Excel, Access).
  • Worked with data movement utilities IMPORT, LOAD, and EXPORT under high volume conditions.
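The log-file outlier mining described above can be sketched with a simple z-score rule; the latency values below are fabricated stand-ins for numbers parsed out of log lines:

```python
import statistics

# Hypothetical per-request latencies (ms) parsed from log files.
latencies = [102, 98, 110, 95, 105, 101, 97, 890, 103, 99]

# Flag values more than two sample standard deviations from the mean.
mean = statistics.mean(latencies)
stdev = statistics.stdev(latencies)
outliers = [x for x in latencies if abs(x - mean) > 2 * stdev]
```

A z-score cutoff is the simplest approach; robust alternatives (median absolute deviation, isolation forests) behave better when outliers inflate the standard deviation itself.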
