
Software Developer Resume


San Jose, CA

Dear Sir,

I have been working in software development and in architecting complex algorithms and data structures for Distributed Computing, Internet infrastructure, and IC Design for the last 15 years. My extensive experience and proficiency in a multitude of areas will be beneficial to your organization.

Please find the attached resume for your consideration.

For convenience, the resume is divided into two parts.

PART I: Concise resume

PART II: Details of the projects done

Sincerely,
Alok

RESUME (part I)

VISA STATUS

US Citizen

CAREER GOAL

Looking for a challenging and rewarding technical leadership role that leverages my knowledge, skills, and problem-solving and analytical abilities.

SUMMARY

  • About 15 years of experience in software development and in architecting complex algorithms and data structures for Distributed Computing, Internet infrastructure, and IC Design.
  • Extensive experience and proficiency in C/C++, Java, Groovy, Perl, Tcl/Tk, JavaScript, Verilog, VHDL, XML, Hadoop, Pig, Hive, lex/yacc, CAD tools, Synopsys Design Compiler, P&R tools, and HDL simulators.
  • Proficient knowledge and experience in designing and implementing:
  • Multithreaded applications and frameworks for large data transfer workflows, built with messaging systems and graphs
  • Distributed computing using the Hadoop, Pig, and Hive MapReduce frameworks
  • Distributed storage using NoSQL key-value databases
  • Large Perl modules for database manipulation
  • Unix network (TCP/IP) application programming
  • Glue applications around a symbolic equivalence checker, work that led to a patent filing
  • Experience in designing and implementing graph algorithms and data structures, circuit design databases, multi-million-transistor netlists, a symbolic equivalence checker, compilers (Verilog logic synthesizer), cost optimization problems, and low-level assembly programs.
  • GUIs using Tcl/Tk and Java (JDK)
  • Knowledge of software engineering, distributed computing, IC design, operating systems, compiler construction, computer architecture, computer networks, TCP/IP, and SQL
  • Experience as a technical lead mentoring team members and as a team player, working on the full project life cycle from beginning to end.

PROFESSIONAL EXPERIENCE

1. Confidential, San Jose, CA (Apr 2010 - present), Consultant

  • Implemented statistical data mining (association rule learning) for detecting bad money transfers using IP addresses, cookies, emails, etc., in Hadoop, Cascading, and Java.
  • Created a framework of Hadoop MapReduce classes (optimized joins, group-by, vector mathematics) for porting Teradata SQL jobs to Hadoop using Groovy/Java.
  • Designed and developed a data mining algorithm (social network cluster rings) in Hadoop Java MapReduce for fraud detection over massive data.
  • Part of the team implementing data analysis and data mining such as linking fraudsters and fuzzy clustering based on Levenshtein distance.
  • Performed extensive Hadoop administration work such as setting up clusters, monitoring, and troubleshooting.
  • Using Java/Groovy, created a workflow system similar to Oozie, built on messaging services and graphs, to automatically run, monitor, and alert on the weekly statistical analysis jobs.
  • Made and implemented numerous architectural suggestions and code changes that improved runtime by many orders of magnitude and reduced memory usage by almost 5x.
  • Mentored and led the China QA team.

2. Confidential, Sunnyvale, CA ( )

  • Implemented data analysis algorithms for SDS Targeting using the Hadoop distributed framework.
  • Performed data extraction and formatting from various SQL databases and internal Yahoo databases.
  • Worked on converting various consumer statistical models.
  • Worked on many multithreaded tools and applications for loading large data sets onto the compute grid (ultRecord format, data conversion).
  • Helped other groups design and architect storage on the Hadoop-based columnar store (Zebra).
  • Created a restricted shell to enforce compute grid security.
  • Worked on the team responsible for anonymizing personal data across all of Yahoo.

3. Confidential, Santa Clara, CA ( )

  • Designed and implemented the charge-share solutions for custom circuits.
  • Based on formal verification techniques, implemented the signal relation module to handle complex design constraints using the CUDD package and graph algorithms.
  • Designed and implemented the circuit database infrastructure, which handles multimillion-transistor designs and can theoretically handle terabytes of design data.
  • Implemented the parallel processing solution for the circuit checker to improve the run time.
  • Improved performance of various in-house-built tools (noise tools).

4. Confidential, Sunnyvale, CA ( )

  • Custom circuits' Symbolic Electrical Rule Checker (ERC), a CUDD ADD (algebraic decision diagram) based ERC. The main functions included:
    o Custom circuit classification and databases
    o Rewriting the whole code base on the new database infrastructure
    o Bug fixes and enhancements
    o Customer support to get closer to the design issues
    o Improvements in the current code for better classification
    o Logical abstraction from the custom circuits
    o Distributed computing to run the CPU-intensive electrical design checks on the fly
  • The user maintains the dependencies of the sub-calculations; the tool splits the bigger design into smaller ones, runs them in different threads and on different machines, and then recombines the results.

5. Confidential, Sunnyvale, CA ( )

  • Designed and implemented complex graph-based transistor analysis algorithms for ERC (electrical rule checking) of custom transistor circuits using formal verification techniques.
  • Designed and implemented the charge-share solution based on ADDs.
  • Worked with the team responsible for implementing activity-based power analysis for logic gates.
  • Implemented the parasitic graph classification for logic gates so that the power tool can also analyze custom circuits.
  • Designed and implemented the Tcl-based glue for the transistor toolkit.

6. Confidential, Mountain View, CA. (Consultant from )

  • Fixed bugs in the Verilog compiler for logic synthesis.
  • Enhanced the Verilog compiler (logic synthesizer) to support more synthesis constructs, such as variable-indexed reads and writes, function calls, parameterized modules, multipliers, constant propagation, and inline assertions.
  • Rewrote the whole Verilog logic synthesizer to make it more robust, using design patterns.
  • Fixed bugs in and maintained the formal verification model checker.
  • Worked with the symbolic equivalence checker and Verilog synthesizer to create a GUI-based method for design navigation.

The relevant patent, "Trace based method for design navigation," was filed.

7. Confidential, Santa Clara, CA ( )

  • Maintained and added new features to the C profiler and fixed bugs in the C compiler.
  • Part of the team implementing multiprocessor simulation using Unix networking.
  • Using Perl/C/JavaScript-HTML, databases, and TCP networking, built a distributed system that schedules and manages server farms and runs the available jobs across the machines in the farm, similar to LSF. It also maintains a database, and users can search on various criteria through a web interface.
  • Using C++/STL and LEDA (similar to Verilog PLI), wrote the constant propagation program. It reads the Verilog RTL and propagates constant signals across module boundaries, and it is used along with a commercial RTL verification tool (HDL Score).
  • Using C and LEDA, enhanced and worked on the program that converts synthesizable Verilog RTL to synthesizable VHDL RTL.
  • Part of the team doing ASIC verification of the configurable processor, mostly involved in:
  • Writing and changing the diagnostic vectors for the test environment of the configurable microprocessors.
  • Writing directed test cases for the data memory management unit and the execution unit.
  • Writing PLI routines.

8. Confidential, CA ( )

  • Developed a static timing analyzer for the logic optimization tool.
  • Maintained and developed new algorithms for buffer/sequential optimization.
  • Using Java AWT components, developed a GUI for the synthesis tool.
  • Using object-oriented Tcl/Tk with C, developed and maintained the GUI for the P&R design tool.
  • Using lex/yacc/C, developed the frontend compiler (parser/linker/checker).
  • Wrote the code generation routines of the backend compilers and implemented macro (adder/incrementer/mux/decoder) generators.
  • Using lex/yacc/C/C++, wrote various interfaces to third-party tools.
  • Developed the command shell interface using Tcl.
  • Wrote and integrated numerous complex Perl programs to manipulate netlists.
  • Wrote numerous synthesizable Verilog/VHDL designs for internal use.
  • Executed the complete ASIC flow, from logic synthesis using Synopsys to static timing analysis, for customer designs with timing closure issues.

9. Confidential ( 7)

  • Used Verilog/VHDL for ASIC library development and wrote synthesizable test designs such as an 8-bit CPU, controllers, etc.
  • Used C/Perl to write utilities for design verification and validation.

10. Confidential ( 6)

  • Involved in the development/QA of the DSP568000 simulator using C/C++.

DESCRIPTION OF THE PATENTS:

1) Patent:

Title: Trace based method for design navigation

Abstract:

A highlighting system for use with electronic circuit design tools is provided for displaying signal waveforms and Register Transfer Logic (RTL) source code portions corresponding to a selected signal in the same window.

The user selects a time and signal to be explored. Based on the selected time and signal, the values of all related signals are identified from a database generated by simulation of RTL source code. Nodes corresponding to the related signals are identified from a gate-level netlist corresponding to the RTL source code, and the nodes responsible for the particular value of the selected signal at the selected time are identified. The nodes are then mapped onto the RTL source code portions by a process of instrumentation. The RTL source code portions so identified are then displayed.

Responsibility: I was responsible for co-creating the idea and for the design and coding of the software.

DESCRIPTION OF THE PROJECTS DONE

HADOOP AND DISTRIBUTED APPLICATIONS

(Using Hadoop, Cascading, Java, Groovy, Perl, and databases)

Project: Data analysis for financial risk using statistics, data mining, and Hadoop

Responsibility:

Profile:

1) Developed and implemented Hadoop MapReduce code and a framework for statistical data mining (association rule learning) of bad money transfer rates using different parameters like IP addresses, cookies, emails, etc.

2) Designed and developed the Hadoop programs for clustering users to:

- find cluster network rings of potential fraudsters of size (m = senders, n = receivers) using the database of transactions between senders and receivers

- find clusters of related users using fuzzy matching (Levenshtein distance) on different parameters like emails (see the sketch after this project description)

3) Part of the team designing, discussing, and implementing the data analysis using graph-based linking based on user rank.

Rank is the strength of the link between users, computed over different parameters like shared cookies, shared IPs, and shared emails.

Knowledge gained: various data mining and classification algorithms, financial risk analysis, and their implementation
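
As an illustration of the fuzzy matching mentioned in item 2 above, the following is a minimal Java sketch of the classic Levenshtein edit distance; the sample identifiers and threshold are illustrative assumptions, not values from the production system.

```java
// Illustrative sketch: dynamic-programming Levenshtein distance, as used for
// fuzzy matching of user attributes (e.g., email strings).
public final class FuzzyMatch {

    // Standard edit-distance computation between two strings (two rolling rows).
    public static int levenshtein(String a, String b) {
        int[] prev = new int[b.length() + 1];
        int[] curr = new int[b.length() + 1];
        for (int j = 0; j <= b.length(); j++) prev[j] = j;
        for (int i = 1; i <= a.length(); i++) {
            curr[0] = i;
            for (int j = 1; j <= b.length(); j++) {
                int cost = (a.charAt(i - 1) == b.charAt(j - 1)) ? 0 : 1;
                curr[j] = Math.min(Math.min(curr[j - 1] + 1, prev[j] + 1), prev[j - 1] + cost);
            }
            int[] tmp = prev; prev = curr; curr = tmp;
        }
        return prev[b.length()];
    }

    // Two attribute values are "linked" if their edit distance is small.
    // The threshold is a hypothetical example, not the production value.
    public static boolean fuzzyLinked(String x, String y, int maxDistance) {
        return levenshtein(x, y) <= maxDistance;
    }

    public static void main(String[] args) {
        System.out.println(levenshtein("jsmith@example.com", "j.smith@example.com")); // 1
        System.out.println(fuzzyLinked("alok123", "a1ok123", 2));                     // true
    }
}
```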

Project: Workflow System for Hadoop

Responsibility: I was solely responsible for architecting the Groovy/Java-based multithreaded workflow system built on a messaging service.

Profile:

1) It is much lighter weight than Oozie

2) I created a simple thread-safe message service similar to JMS and used it to execute the various job flows and to do the scheduling (a minimal sketch follows this project description).

3) It manages job failures, alerts, data ETL, and data downloading, and it executes the user's directed job graph.

Knowledge gained: groovy, java, messaging service, design patterns, hadoop job scheduling etc
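
For illustration, below is a minimal Java sketch of the kind of thread-safe publish/consume message service described above (loosely JMS-like). The class names, message strings, and the poison-pill shutdown are assumptions for the example, not the actual internal API.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Minimal thread-safe message service: producers publish job messages,
// a worker thread consumes and runs them.
final class MessageService {
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();

    void publish(String message) {          // producer side
        queue.offer(message);
    }

    String take() throws InterruptedException {  // consumer side, blocks until available
        return queue.take();
    }
}

public class WorkflowSketch {
    public static void main(String[] args) throws Exception {
        MessageService bus = new MessageService();

        // Worker thread consumes job messages and "runs" them.
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    String job = bus.take();
                    if (job.equals("SHUTDOWN")) break;   // poison pill ends the worker
                    System.out.println("Running job: " + job);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        worker.start();

        // The scheduler publishes jobs in dependency order; a real system would
        // walk the user's directed job graph and handle failures and alerts.
        bus.publish("extract-daily-data");
        bus.publish("run-weekly-statistics");
        bus.publish("SHUTDOWN");
        worker.join();
    }
}
```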

Project: Ported Teradata SQL scripts to Hadoop MapReduce programs

Responsibility: I was the Hadoop consultant hired for this work.

Profile:

1) To support complex queries and special optimizations, we bypassed Pig and Hive and wrote the queries directly in MapReduce (a minimal group-by sketch follows this project description).

2) Along the way I designed and created Puma, a modular, vector-mathematics-based set of Java utility modules for MapReduce applications at PayPal.

- It contains various vector-based statistical classes

- It contains various helper classes like Join, GroupBy, Select, etc.

3) The Puma package is being used in other applications to shorten development time.

Knowledge gained: design patterns, software engineering, risk analysis, sql
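
As a hedged illustration of writing such queries directly in MapReduce, here is a small self-contained sketch of a GROUP BY ... SUM(...) job using the standard Hadoop MapReduce API; the tab-separated input layout and class names are assumptions for the example, not the Puma code itself.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Plain-MapReduce GROUP BY key, SUM(amount) over "key<TAB>amount" lines.
public class GroupBySum {

    public static class GroupMapper extends Mapper<LongWritable, Text, Text, DoubleWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t");
            if (fields.length < 2) return;                       // skip malformed rows
            context.write(new Text(fields[0]), new DoubleWritable(Double.parseDouble(fields[1])));
        }
    }

    public static class SumReducer extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
        @Override
        protected void reduce(Text key, Iterable<DoubleWritable> values, Context context)
                throws IOException, InterruptedException {
            double sum = 0.0;
            for (DoubleWritable v : values) sum += v.get();      // aggregate per group key
            context.write(key, new DoubleWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "group-by-sum");
        job.setJarByClass(GroupBySum.class);
        job.setMapperClass(GroupMapper.class);
        job.setCombinerClass(SumReducer.class);                  // sums are associative
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(DoubleWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```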

Project: To perform hadoop admin and tools developments

Responsibility: Set up and managed the Hadoop cluster.

Profile:

1) Cluster Maintenance, Cluster deployment, Cluster Monitoring and Troubleshooting

2) Troubleshot any performance issues that came up during the lifecycle of the cluster

Knowledge gained: hadoop administration

Project: To create the behavioral targeting modeling platform for advertising.

Responsibility: part of the team implementing data acquisition and modeling using the distributed Hadoop framework.

Profile:

1) It creates the modeling/scoring platform from terabytes of data using the Hadoop distributed framework.

2) It involved some database searching and data acquisition.

3) Creating and converting between various model formats.

Knowledge gained: databases, distributed applications (using java), software engineering.

Project: To load the daily terabytes (TB) of data onto the Hadoop compute grid.

Responsibility: I solely designed and architected the Perl-based multithreaded system.

Project: Various Java, Perl and C/C++ based utilities, tools and applications using Hadoop framework

Responsibility: Both as an individual contributor and as a technical lead

Profile:

1) XML loader for large data

2) Personal identifier anonymization for query and other logs (a hashing sketch follows this project description)

3) Utilities to do the data conversions using columnar storage table

4) Helped the ad-click joining team port their application to Hadoop

Knowledge gained: problem solving in distributed and parallel computing.
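
As an illustration of identifier anonymization (item 2 above), the following is a minimal Java sketch using a salted SHA-256 hash so that records can still be joined without exposing raw values; the salt handling and names are assumptions for the example, not the actual production scheme.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Replaces each identifier with a salted one-way hash (stable pseudonym).
public final class Anonymizer {
    private final byte[] salt;

    public Anonymizer(byte[] salt) {
        this.salt = salt.clone();
    }

    // Returns a hex-encoded SHA-256 of salt + identifier.
    public String anonymize(String identifier) {
        try {
            MessageDigest digest = MessageDigest.getInstance("SHA-256");
            digest.update(salt);
            byte[] hash = digest.digest(identifier.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : hash) hex.append(String.format("%02x", b & 0xff));
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("SHA-256 not available", e);
        }
    }

    public static void main(String[] args) {
        Anonymizer a = new Anonymizer("example-salt".getBytes(StandardCharsets.UTF_8));
        System.out.println(a.anonymize("user@example.com")); // same input -> same pseudonym
    }
}
```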

DISTRIBUTED SYSTEM

(Using Perl, C, Unix-networking, database and html-JavaScript)

Project: To develop a distributed system to launch Unix/Linux jobs over various machines.

Responsibility: I was solely responsible for development and testing.

Profile:

1) It has a frontend language that lets the user write parallel code using Perl 5 syntax. It also lets the user specify dependencies, similar to makefiles.

2) It then assigns priorities and schedules the jobs depending on their dependencies and on the position of each job node in the dependency graph (see the scheduling sketch after this project description).

3) It then starts a daemon on all the client machines and a server on the server machine.

4) The server manages and dispatches jobs to the clients using a Berkeley DB database and network socket connections.

5) All the jobs are then recorded in the database, and the user can search on various criteria. The search algorithm was a double-indexing search, and the results were displayed through the web interface.

Knowledge gained: network application programming, databases, html/CGI and distributed application
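
For illustration, here is a minimal Java sketch of dependency-driven job ordering (Kahn's topological sort), which captures the core scheduling idea described in item 2; the job names are hypothetical, and the real system adds priorities, daemons, and socket dispatch on top of this.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Orders jobs so that every job runs only after all of its prerequisites.
public final class DependencyScheduler {

    // edges: job -> list of jobs that depend on it (must run after it).
    public static List<String> schedule(Map<String, List<String>> edges) {
        Map<String, Integer> indegree = new HashMap<>();
        for (String job : edges.keySet()) indegree.putIfAbsent(job, 0);
        for (List<String> dependents : edges.values())
            for (String d : dependents) indegree.merge(d, 1, Integer::sum);

        Deque<String> ready = new ArrayDeque<>();
        for (Map.Entry<String, Integer> e : indegree.entrySet())
            if (e.getValue() == 0) ready.add(e.getKey());        // jobs with no prerequisites

        List<String> order = new ArrayList<>();
        while (!ready.isEmpty()) {
            String job = ready.poll();
            order.add(job);                                      // "dispatch" the job here
            for (String d : edges.getOrDefault(job, List.of()))
                if (indegree.merge(d, -1, Integer::sum) == 0) ready.add(d);
        }
        if (order.size() != indegree.size())
            throw new IllegalStateException("cycle in job dependencies");
        return order;
    }

    public static void main(String[] args) {
        Map<String, List<String>> edges = new HashMap<>();
        edges.put("compile", List.of("simulate"));
        edges.put("simulate", List.of("report"));
        edges.put("report", List.of());
        System.out.println(schedule(edges));   // [compile, simulate, report]
    }
}
```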

TOOLS FOR CUSTOM CIRCUITS

Project: To develop the database infrastructure for the custom circuits

Responsibility: Designed and implemented the whole infrastructure. Gave presentations and got approval for its use based on its performance.

Profile:

1) It can handle multimillion-transistor netlists without memory or CPU overhead.

2) It can theoretically handle many terabytes of data without memory or CPU overhead.

3) It has an API that hides all the internal details, so the user feels like they are writing a normal in-memory application rather than a database application.

Knowledge gained: Database design, software engineering.

Project: To develop the core functionality that can be used to create various ERC rules

Responsibility: Implemented Transistor Analysis Algorithms.

Profile:

1) It implemented the algorithms from the following papers:

"Local Analysis of Linear Networks by Aggregation of Characteristic Lines" by Jakob Mauss

"CMOS Circuit Verification with Symbolic Switch-Level Timing Simulation" by Claytan B McDonald and Randal E. Bryant.

"An ADD-based Approach to Evaluating Charge Sharing in Custom CMOS Circuits"

2) Later the above algorithms were used to create the following ERC rules, and I helped the other team implement them seamlessly: BetaRatio, CcrBulkNodes, ChargeShare, DriveImpedance, FanOut, FloatingGate, FloatingMosGate, StackDepth, StackPrecharger, TransitionTime, VoltageLevel, XtrBulkNode, XtrGeometry, XtrVoltageDomain.

Knowledge gained: designing and leading the project, various transistor algorithms, and BDDs

Project: To Develop the "Symbolic Timing/Functional Simulator for custom circuits".

Profile:

1) The project is in progress.

2) It is based on the PhD thesis of Clayton B. McDonald

Knowledge gained: leading and designing large software and symbolic simulation techniques.

Project: Power Tools

Responsibility:

1) I was part of a three-person team responsible for developing the activity-based gate-level power tool.

2) My responsibility included the software design and part of its implementation in C++; the algorithm itself was proposed by the lead, who initially designed the Perl-based power tool.

3) Later I designed, led, and implemented the power analysis for the custom circuits using the gate-level power tool.

Profile:

1) We had cells characterized with power data.

2) The classification tool classifies the transistor logic into logic gates.

3) The power tool uses the logic gates and calls the Verilog simulator to find the dynamic activity of each net.

4) The commercial Verilog simulator was linked to the power tool through Verilog PLI over UNIX sockets to calculate the activity.

5) The total power was calculated using the power characterization of the cells and the activity (a small example of this calculation follows this project description).

6) For doing the power analysis, we needed to classify the parasitic network of the SPICE netlist according to the logic partitions.

I designed and implemented the graph-based algorithm for classifying the parasitic network based on the logic gates.

Knowledge gained: understanding of the power analysis of CMOS circuits. Various power analysis technique used.
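
As a small illustration of step 5, the sketch below sums per-net dynamic power using one common form of the CMOS switching-power formula, P = alpha * C * V^2 * f; the net list and numeric values are made-up examples, not characterization data from the tool.

```java
// Toy dynamic-power summation over a few nets.
public final class DynamicPower {

    // alpha: switching activity (transitions per cycle, from simulation),
    // capF: net capacitance in farads, vdd: supply voltage, freqHz: clock frequency.
    static double netPower(double alpha, double capF, double vdd, double freqHz) {
        return alpha * capF * vdd * vdd * freqHz;
    }

    public static void main(String[] args) {
        double vdd = 1.2, freqHz = 500e6;
        double[][] nets = {            // {activity, capacitance in F} -- example values
            {0.15, 5e-15},
            {0.40, 12e-15},
            {0.05, 2e-15},
        };
        double total = 0.0;
        for (double[] n : nets) total += netPower(n[0], n[1], vdd, freqHz);
        System.out.printf("Total dynamic power = %.3e W%n", total);
    }
}
```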

LOGIC OPTIMIZATION AND CONSTRAINED TIMING ANALYSIS

Project: To develop the static logic timing analyzer in C.

Responsibility: Member of a three-person team.

Profile:

1) It first traces the logic circuit graph, annotates the timing begin and end points, and generates a DFS ordering.

2) It then annotates the arrival time at the nodes in DFS order.

3) It then annotates the required time at the nodes in reverse DFS order.

4) A slack graph is constructed and passed to the logic optimizer to do the timing-driven optimization (a small sketch of the arrival/required/slack computation follows this project description).

5) The timing reporter gives the most critical path with respect to each clock.

Knowledge gained: sequential timing analysis, graph manipulation
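
For illustration, here is a compact Java sketch of the arrival/required/slack computation described in steps 2-4, run over a tiny hand-built DAG in topological order; the node names, delays, and clock period are illustrative only.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Arrival/required/slack over a small combinational graph.
public final class StaticTiming {

    record Edge(String from, String to, double delay) {}

    public static void main(String[] args) {
        // Topologically ordered nodes of a small path graph.
        List<String> topo = List.of("in", "g1", "g2", "out");
        List<Edge> edges = List.of(
            new Edge("in", "g1", 1.0),
            new Edge("g1", "g2", 2.0),
            new Edge("in", "g2", 4.0),
            new Edge("g2", "out", 1.0));
        double clockPeriod = 6.0;

        // Forward pass: arrival(v) = max over fan-in of arrival(u) + delay(u->v).
        Map<String, Double> arrival = new HashMap<>();
        for (String n : topo) arrival.put(n, 0.0);
        for (String n : topo)
            for (Edge e : edges)
                if (e.from().equals(n))
                    arrival.merge(e.to(), arrival.get(n) + e.delay(), Math::max);

        // Backward pass: required(u) = min over fan-out of required(v) - delay(u->v).
        Map<String, Double> required = new HashMap<>();
        for (String n : topo) required.put(n, clockPeriod);
        for (int i = topo.size() - 1; i >= 0; i--) {
            String n = topo.get(i);
            for (Edge e : edges)
                if (e.to().equals(n))
                    required.merge(e.from(), required.get(n) - e.delay(), Math::min);
        }

        // Slack = required - arrival; negative slack marks a timing violation.
        for (String n : topo)
            System.out.printf("%-4s arrival=%.1f required=%.1f slack=%.1f%n",
                    n, arrival.get(n), required.get(n), required.get(n) - arrival.get(n));
    }
}
```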

Project: Maintenance

Responsibility: Maintenance of the buffer/sequential optimization

Profile:

1) Buffer optimization included resizing, restructuring, and fan-in/fan-out corrections by inserting buffer trees.

2) Sequential optimization included retiming and removal of redundant latches/flip-flops depending on the state.

Knowledge gained: Timing Driven buffer optimization.

FRONTEND COMPILER (PARSER/LINKER/CHECKER USING C/C++/LEX/YACC)

VERILOG LOGIC SYNTHESIZER

Project: To enhance the free v2clm RTL-to-alif netlist converter

Responsibility: I initially joined the co-founder working on it and later owned it.

Profile:

Enhanced the Verilog logic synthesizer to support more synthesis constructs, such as variable-indexed reads and writes, function calls, parameterized modules, multipliers, constant propagation, and inline assertions.

I rewrote the whole Verilog logic synthesizer to make it more robust, using design patterns.

RTL VERIFICATION USING C++/STL

Project: To develop the constant propagator for doing RTL verification.

Responsibility: I was solely responsible for development and testing

Profile:

1) It uses C++/STL along with LEDA libraries to parse the Verilog RTL and create the hierarchical data structure.

2) It then builds the network of all the signal assignments that are combinational logic.

3) It then propagates constant signals and signal vectors across module boundaries and generates output of the form TOP.inst1.inst2.a = 2'b10; (a toy sketch of the propagation loop follows this project description).

4) It then splits the buses at the module interfaces.

5) The output is used along with HDL Score (commercial EDA software) to do the RTL verification.
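
As a toy illustration of the propagation loop mentioned in step 3, the following Java sketch runs worklist-based constant propagation over a small flat gate netlist; the gates, net names, and three-valued lattice are simplifications for the example, not the actual C++/STL/LEDA implementation.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Worklist-based constant propagation over a tiny made-up netlist.
public final class ConstProp {

    enum Val { ZERO, ONE, UNKNOWN }

    record Gate(String type, List<String> inputs, String output) {}

    static Val evalAnd(List<Val> in) {
        if (in.contains(Val.ZERO)) return Val.ZERO;              // 0 dominates AND
        return in.contains(Val.UNKNOWN) ? Val.UNKNOWN : Val.ONE;
    }

    static Val evalOr(List<Val> in) {
        if (in.contains(Val.ONE)) return Val.ONE;                // 1 dominates OR
        return in.contains(Val.UNKNOWN) ? Val.UNKNOWN : Val.ZERO;
    }

    public static void main(String[] args) {
        List<Gate> gates = List.of(
            new Gate("AND", List.of("a", "tie0"), "n1"),         // n1 = a & 1'b0 -> 0
            new Gate("OR",  List.of("n1", "b"),   "n2"));        // n2 = n1 | b   -> unknown

        Map<String, Val> nets = new HashMap<>();
        nets.put("tie0", Val.ZERO);                              // constant driver

        Deque<Gate> worklist = new ArrayDeque<>(gates);
        while (!worklist.isEmpty()) {
            Gate g = worklist.poll();
            List<Val> in = g.inputs().stream()
                    .map(n -> nets.getOrDefault(n, Val.UNKNOWN)).toList();
            Val out = g.type().equals("AND") ? evalAnd(in) : evalOr(in);
            if (out != nets.getOrDefault(g.output(), Val.UNKNOWN)) {
                nets.put(g.output(), out);
                worklist.addAll(gates);                          // naive re-check on any change
            }
        }
        System.out.println(nets);                                // n1 is resolved to ZERO
    }
}
```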

Project: Writing Verilog test benches from Perl modules.

Responsibility: I was part of the team writing:

1) Diagnostic cases for testing the fetch block

2) Verilog PLI routines for enhancing simulation

C/C++ COMPILER/SIMULATOR

Project: Maintaining and bug-fixing the C profiler and C compiler

Responsibility: I was part of the compiler group.

1) Fixed and added new features for various kinds of profiling in gprof.

2) Maintained and enhanced the multiprocessor simulation for the simulator.

Project: To develop the hardware module linker and semantic checker

Responsibility: I was solely responsible for development and testing

Profile:

1) It uses lex/yacc to parse the Berkeley alif netlist generated from the Verilog synthesizer

2) It then performs various semantic and hardware checks, such as floating nets, illegal connections, recursion, etc.

3) It then links the subcircuits in the modules to build the complete C data structure. The functionality is similar to a C linker, which links functions and checks their arguments. These data structures were used for optimization and timing analysis.

Knowledge gained: compiler frontend, lex/yacc, C, data structure designs using graphs, trees, list, stack, queues, hash tables.

Project: To develop the compiler for Synopsys libraries.

Responsibility: Member of a three-person group; I was responsible for writing the frontend for the Synopsys library format.

Profile:

1) It uses lex/yacc and C to parse and build the data structure.

2) APIs were written so that other programs could interface with it.

Knowledge gained: lex/yacc, C/C++, data structure design like graphs, trees, list, queues, hash tables.

Project: To write and maintain new and existing parsers

Responsibility: Responsible for writing various parsers/writers in C/C++ to convert one file format into another.

Knowledge gained: logic synthesis, graph traversal.

Project: Module generator

Responsibility: Member of a two-person group responsible for synthesizing macros such as mux trees, decoder trees, adders/incrementers, etc.

Profile:

1) It uses user-specified constraints (area/timing/structure) to generate the netlist.

2) Muxes and decoders were implemented using balanced trees (a toy mux-tree sketch follows this project description).

3) Adders/incrementers were implemented using a combination of carry-lookahead, ripple, and mux adders.

Knowledge gained: hardware macros.
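
As a toy illustration of item 2, the sketch below builds a balanced tree of 2:1 muxes for an N-input mux (N a power of two) and emits simple structural Verilog assign statements; the signal naming scheme is an assumption for the example, not the generator's actual output format.

```java
import java.util.ArrayList;
import java.util.List;

// Builds a balanced 2:1-mux tree: level i is controlled by select bit i,
// so the depth is log2(N) for N inputs (N assumed to be a power of two).
public final class MuxTreeGenerator {
    private int tmpCount = 0;
    private final List<String> verilogLines = new ArrayList<>();

    // Combines the inputs pairwise, level by level, until one output remains.
    String build(List<String> inputs, String selPrefix) {
        List<String> level = new ArrayList<>(inputs);
        int selBit = 0;
        while (level.size() > 1) {
            List<String> next = new ArrayList<>();
            for (int i = 0; i < level.size(); i += 2) {
                String out = "mx" + (tmpCount++);
                verilogLines.add("assign " + out + " = " + selPrefix + "[" + selBit + "] ? "
                        + level.get(i + 1) + " : " + level.get(i) + ";");
                next.add(out);
            }
            level = next;
            selBit++;                       // next tree level uses the next select bit
        }
        return level.get(0);
    }

    public static void main(String[] args) {
        MuxTreeGenerator gen = new MuxTreeGenerator();
        String out = gen.build(List.of("d0", "d1", "d2", "d3"), "sel");
        gen.verilogLines.forEach(System.out::println);
        System.out.println("// mux output net: " + out);
    }
}
```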

MICROCODING (ASSEMBLY LANGUAGE)

Project: Micro coding for the DSP chip

Responsibility: Implementing various microcode modules for Motorola DSP chip 568000.

Profile:

The various basic routines developed in assembly code were -

1. Mathematical functions (SHL, SHR, MULT, DIV, FLOATING POINT OPERATION).

2. Filters (various FIRs and IIRs)

3. Cyclic redundancy code (CRC)

4. Fast Fourier Transform (FFT)

5. Modem Initialization.

Knowledge gained: general DSP architecture and assembly coding

USER INTERFACE AND GUI

Project: Develop GUI using Java and C.

Responsibility: In a team of two, my responsibility included -

1. Developing the GUI forms, and

2. Attaching them to the C routines of the synthesis software.

Profile:

1) The GUI was developed on the JDK. It was based on various java.awt components like menus, lists, scrollbars, text, selectables, and panels.

2) These Java programs were integrated with the native C code.

Knowledge gained: Expertise in using the Java packages, OOP, and C.

Project: Maintain/Enhance GUI using TCL/TK and C

Responsibility: I was responsible for

1) Maintenance/Enhancement/Development of TCL/TK GUI

2) Writing the wrapper to interact with the C routines

Profile: It uses Avanti's object-based library for Tcl/Tk, which is built on top of the standard Tcl/Tk scripts and windows.

Knowledge gained: Gained expertise in TCL/TK and using them with the C routines.

Project: Develop Bug Tracker using TCL/TK

Responsibility: None

Profile:

1) It is a Tcl/Tk-based GUI that helps application engineers and managers keep track of open bugs and issues.

2) It can be used to file a bug, assign it to an owner, set the reminder frequency, and perform advanced searches. It continuously keeps track of open bugs and sends reminders to the owner so that they stay updated.

HARDWARE LANGUAGES: VHDL, VERILOG and C MODELING

Responsibility: Included the development and verification of small ASIC designs and standard ASIC library cells in VITAL (VHDL), Verilog, and Synopsys.

Profile:

1) Modeled and validated various modules on the VITAL (VHDL) and Verilog platforms. The library contained macros like single- and dual-port RAMs, apart from combinational and sequential standard cells.

2) Wrote and did the logic synthesis for various small designs (up to 5K gates) in Verilog and VHDL. The designs included modules like an 8-bit CPU, a cache controller, a multiplier, a divider, etc.

3) Wrote various models in C to verify the functionality of the cells.

Knowledge gained: Expertise in HDL (Verilog/VHDL), logic synthesis, and ASIC libraries.

PERL SCRIPTS

Responsibility: Wrote numerous Perl scripts (some as large as 5,000 lines) for modifying netlists and for writing utilities that are called from the main C routines.

Profile: A few notable scripts were

1) Canonical Form Generator: uses Perl 5 tree-like data structures for manipulating the Berkeley alif netlist.

2) Macro Adder: This Perl script parses the logic equation and recognizes the adder cells and adds appropriate macros in the Berkeley alif netlist.

3) Netlist manipulator: This Perl script identifies various pad cells and adds various attributes for the optimization

4) Verilog Random Pattern Generator: given any Verilog netlist, it generates test patterns.
