- Over 8 years of experience in Information Technology as an Analyst, Developer, and Administrator.
- Good experience with Microsoft SQL Server 2003/2005/2008, MySQL, and MS Access.
- Proficient in ETL and data warehousing.
- Provided assistance with other system-related programming and design tasks.
- Good knowledge of the software development life cycle (SDLC), including requirements gathering, analysis, design, development, testing, implementation, and maintenance of software applications.
- Expertise in the concepts of data warehousing, data marts, ER modeling, dimensional modeling, and fact and dimension tables, using the data modeling tools Erwin and Sybase PowerDesigner.
- Used Informatica PowerCenter 9.0.1 to extract, transform, and load data into a Netezza data warehouse from sources such as Oracle and flat files.
- Experience in data warehousing tools such as Informatica and Ab Initio.
- Worked with data delivery teams to set up new Hadoop users. This included creating Linux accounts, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users.
- Worked with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
- Expertise in various Ab Initio component groups such as Partition, De-partition, Database, Datasets, Transform, FTP, Sort, and Miscellaneous.
- ETL experience in cloud environments (AWS and Google Cloud).
- Very good experience in Oracle database application development using Oracle 10g/9i/8i, SQL, PL/SQL, SQL*Loader, and Netezza. Strong experience writing SQL and PL/SQL stored procedures, functions, and triggers.
- Created various ad hoc reports using SAS. Extensively accessed Teradata and Oracle 9i using the SAS SQL pass-through facility.
- 4 years of OLTP and data modeling experience.
- Extensive experience in UNIX shell scripting to automate and schedule jobs.
- Developed access control lists to address security issues.
- Understanding of IoT and time-series data analytics.
- Experience assembling computers.
- Hardware troubleshooting experience.
- Experience in web design using HTML.
- Strong work ethic; a quick-learning, self-motivated team player with good communication and technical skills.
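The Hadoop user-onboarding work described above (Linux accounts, Kerberos principals, HDFS access checks) can be sketched roughly as follows. This is an illustrative outline, not the exact procedure used; the realm, keytab directory, and user names are hypothetical placeholders.

```python
# Hedged sketch of Hadoop user onboarding: build the shell commands to
# create a Kerberos principal and smoke-test HDFS access for a new user.
# All principal/realm/path names below are hypothetical placeholders.

def onboarding_commands(user, realm="EXAMPLE.COM", keytab_dir="/etc/security/keytabs"):
    """Return the shell commands for onboarding one Hadoop user."""
    principal = f"{user}@{realm}"
    keytab = f"{keytab_dir}/{user}.keytab"
    return [
        # Create the principal with a random key and export its keytab.
        f"kadmin.local -q 'addprinc -randkey {principal}'",
        f"kadmin.local -q 'ktadd -k {keytab} {principal}'",
        # Authenticate as the user, then verify basic HDFS access.
        f"kinit -kt {keytab} {principal}",
        f"hdfs dfs -ls /user/{user}",
        f"hdfs dfs -touchz /user/{user}/_access_check",
    ]

for cmd in onboarding_commands("analyst1"):
    print(cmd)
```

Generating the commands as data (rather than running them inline) makes the onboarding steps easy to review before execution; Hive/Pig access checks would follow the same pattern.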
Languages: C, SQL, PL/SQL, SAS 9, Perl scripting, and Korn shell scripting.
Databases: SQL Server 2008, MySQL 5.1, Oracle 10g, MS Access, Netezza 6.8
Operating Systems: Windows Server 2003, Windows XP/Vista/7, UNIX/Linux
BI Tools: SAS 9, Business Objects 6.0
RDBMS: Teradata, Oracle 8.0/8i/9i/10g, DB2, MS Access, MS SQL Server.
Data Modeling: Erwin, Sybase PowerDesigner 12
Confidential, San Jose, CA
Sr. Data Analyst
- Modeled the data for the XRAM Hub project.
- Identified requirements with business stakeholders for these projects.
- Modeled the data hub for the XRAM Research Hub and was involved in backend development for this project.
- Validated and cleansed data from source to target.
- Developed visualizations for trend and growth analysis.
Tools Used: Snowflake, Google Cloud Platform, Python, and Power BI
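The source-to-target validation mentioned above can be sketched as a row-count and per-row checksum comparison. This is a minimal, self-contained illustration with in-memory sample data; the key and column names are hypothetical, and a real pipeline would read from the actual source and target tables.

```python
# Minimal sketch of source-to-target validation: compare row counts and
# per-row fingerprints between two datasets. Rows are dicts keyed by a
# primary key; table/column names here are illustrative only.
import hashlib

def row_fingerprint(row, columns):
    """Stable hash of the selected columns of one row."""
    joined = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(joined.encode()).hexdigest()

def validate(source_rows, target_rows, key, columns):
    """Report count match, keys missing in target, and changed rows."""
    src = {r[key]: row_fingerprint(r, columns) for r in source_rows}
    tgt = {r[key]: row_fingerprint(r, columns) for r in target_rows}
    return {
        "count_match": len(src) == len(tgt),
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "mismatched": sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k]),
    }

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 25}]
print(validate(source, target, "id", ["amt"]))
```

Fingerprinting only the compared columns keeps the check cheap and makes it easy to pinpoint which keys drifted between source and target.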
Confidential, San Francisco, CA
Business / Data Engineer
- Participate in end-to-end documentation of data and reporting requirements for marketing systems projects.
- Create business user specifications for IT projects and work with developers and administrators on technical specifications.
- Identify source systems and data required to meet users' current and potential future needs.
- Develop and execute inbound marketing, demand generation, lead management and automated digital marketing programs to drive lead conversion and ultimately new business acquisition.
- Identify source and target mappings.
- Develop future solution strategies by working closely with business and IT development teams.
- Define and validate transformation logic required to support information requirements.
- Identify effective data visualization solutions to address business questions.
- Participate in the solution development process to ensure the solution addresses all business requirements.
- Identify and implement reporting solutions, using automation capabilities when appropriate.
- Define acceptance criteria for each project.
- Worked with Amazon RDS databases and created instances as required.
- Establish and coordinate user acceptance testing and plan.
- Conduct testing and validation.
- Deliver end-user training.
- Work with internal teams to resolve differences.
- Present status, milestones and issues to management and other stakeholders related to assigned projects.
- Develop the process and supporting artifacts to organize information management into data dictionary, data lineage, and other supporting documents.
- Generated complex analytics SQL queries in Google BigQuery to answer client questions.
- Maintain a thorough understanding of analytics, data repositories, data models and reporting systems, including:
- Understand and document the underlying data elements, constraints, flows and sources supporting the business domain.
- Subject matter expert on business processes, data, and the marketing ecosystem, as well as data received from other sources.
- Participate in meetings with all interested groups to evaluate potential changes to the marketing data and processes.
- Utilize Google BigQuery SQL, Amazon Redshift, and Tableau to build and drive reporting excellence.
- Proactively identify gaps in data coverage, required enhancements, and opportunities for improvement, drawing on an understanding of marketing systems and data needs.
- Work with customers to translate high-level requirements into actionable, achievable products.
- Help analyze and solve day-to-day challenges with big data and analytics/reporting systems, including:
- Research data in Hadoop and other source systems to identify issues.
- Resolve and prioritize issues using big data tools.
- Proactively identify data quality problems and support solution identification.
- Interpret highly complex physical and logical data models.
- Operate as a project coordinator within a cross-functional/matrix team.
- Perform other job-related duties and activities as requested.
Tools Used: JIRA, Confluence, SQL, Excel, JSON, XML, Google BigQuery, Hadoop, HiveQL, AWS.
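The analytics SQL mentioned in this role typically takes the shape of a grouped trend query. The sketch below is illustrative only: the `leads` table and its columns are hypothetical, and SQLite stands in for BigQuery so the example is self-contained and runnable.

```python
# Illustrative shape of a trend-analysis query (hypothetical schema);
# SQLite stands in for BigQuery to keep the sketch self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE leads (month TEXT, channel TEXT, converted INTEGER)")
conn.executemany(
    "INSERT INTO leads VALUES (?, ?, ?)",
    [("2017-01", "email", 1), ("2017-01", "email", 0),
     ("2017-01", "social", 1), ("2017-02", "email", 1)],
)

# Monthly conversion rate per channel -- a typical growth/trend query.
rows = conn.execute("""
    SELECT month, channel,
           COUNT(*)       AS leads,
           AVG(converted) AS conversion_rate
    FROM leads
    GROUP BY month, channel
    ORDER BY month, channel
""").fetchall()

for r in rows:
    print(r)
```

In BigQuery the same GROUP BY/aggregate structure applies; only the connection layer and dialect details (e.g. date functions) differ.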
Confidential, Fremont, CA
Sr. Data and Analytics Engineer
- Responsible for integrating with the reporting APIs of vendors from which SolarCity purchases.
- Ensured code was in sync across all environments.
- Troubleshot data discrepancies and production SSIS jobs as daily activities.
- Extensively worked on data extraction, transformation, and aggregation for analysis.
- Managed technical integrations with Salesforce, Atlas, and Tealium.
- Automated data pipelines moving over 20 GB of data from Oracle source tables to Hadoop and Google BigQuery.
- Experienced in designing, deploying, and operating highly available, scalable, and fault-tolerant systems on Amazon Web Services.
- Key member of the team building an internal marketing Data Management Platform (DMP) connecting numerous data points across the consumer journey, such as:
- Billions of Website Display Ad Impressions
- Millions of Website Page Views
- Social Interactions
- Press Release & News Interactions
- Website Conversions
- Sales Funnel Conversion Steps
- CRM Data
- Facebook Atlas API integration: ingested impression, click, and display-ad metadata using the REST API and Python.
- Facebook Lead Form API integration with the Facebook REST API using Python and SQL Server; leads from Facebook were then loaded into Salesforce.
- Built Direct Mail Data Warehouse for reporting and analysis.
- Assist with developing and implementing nurture programs within Marketo.
- Manage form processing and website integrations.
- Build and manage key marketing integrations - salesforce.com, website tools, third party marketing cloud solutions.
- Ensure strategic and tactical alignment with complementary marketing capabilities and activities (e.g., social marketing, telemarketing, conferences, direct mail).
- Analyze the lead management model and identify areas for improvement, including lead delivery, rejected leads, recycle programs, and activity metrics.
- Create data mappings and workflows using Informatica PowerCenter to extract, transform, and load data into the target reporting database.
- Mentor, teach, and instruct automation users in best practices of marketing automation. Work with team on core operational processes and uses of Marketo and Salesforce.
- Collaborate with sales operations on enhancing and troubleshooting issues with the lead management process and integrations with marketing automation platforms.
- Installed and used the AWS CLI to control various AWS services through shell/bash scripting.
- Conduct ETL optimization, troubleshooting and debugging.
- Design and develop test plans for ETL unit testing and integration testing.
- Assist in development of business case documentation.
- Provide project support for development personnel as new functions/processes are designed, written, tested and implemented.
Tools Used: AWS, Amazon Redshift, Google BigQuery, Salesforce, Marketo, Netezza, Python, R, SSIS, SSRS, SQL Server, Tableau, REST APIs, Facebook Graph API, Google Analytics, Tealium
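The REST API ingestion described for the Facebook integrations generally follows a cursor-pagination loop. The sketch below is a hedged illustration: `fetch_page` abstracts the HTTP call (e.g. via `requests`), and the `data`/`next` payload shape is a simplified stand-in, not the real Graph API schema.

```python
# Hedged sketch of paginated REST ingestion. `fetch_page` abstracts the
# HTTP layer; the cursor/payload shape below is illustrative, not the
# actual Facebook Graph API schema.

def ingest_all(fetch_page):
    """Follow 'next' cursors until exhausted, collecting every record."""
    records, cursor = [], None
    while True:
        page = fetch_page(cursor)      # e.g. GET .../insights?after=<cursor>
        records.extend(page["data"])
        cursor = page.get("next")      # missing/None means last page
        if not cursor:
            return records

# Stub standing in for the HTTP layer, for demonstration only.
PAGES = {
    None: {"data": [{"ad_id": 1}, {"ad_id": 2}], "next": "c2"},
    "c2": {"data": [{"ad_id": 3}]},
}
print(ingest_all(lambda cursor: PAGES[cursor]))
```

Separating the pagination logic from the HTTP call keeps the loop testable with stubs and reusable across the different vendor APIs mentioned above.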