A complete guide to Activation Functions used in Neural Networks
Artificial Intelligence (AI) is one of the most trending industries in 2018, and it is changing our world
forever. If you know about AI, you must have heard of Neural Networks, one of the most widely
used and popular algorithms in AI.
In this article, I will talk about the Activation Functions used in Neural Networks. Activation
Functions play a very important role in Neural Networks.

So, how do Neural Networks work?
Artificial Neural Networks (ANNs) are loosely based on the neural networks in our brains. In an ANN,
multiple nodes are interconnected so that signals can pass between them. Because
of these interconnected nodes, ANNs give us amazing results.
To understand how NNs work, first assume a 2-layer NN. That means an NN with one input
layer, one hidden layer, and one output layer.
Note — We don’t count the input layer.

First, we have some input as a vector, and we feed that vector to the network. The network then
performs matrix operations on the input vector to calculate the “weighted sum” of its inputs, adds a
bias, applies some Activation Function, and passes the value to the next layer. We
keep repeating this process until we reach the last layer. This process is known as Forward
Propagation.
In its simplest form, the “weighted sum” is calculated with the following equation:

z = (w1 * x1 + w2 * x2 + … + wn * xn) + b

where the xi are the inputs, the wi are the corresponding weights, and b is the bias.
The final output value is the prediction. We use this prediction to calculate the error by
comparing the output with the label. We then use the error value to calculate the partial derivatives
w.r.t. the weights and update the weights with those values. We keep repeating this process until
the error becomes very small. This process is known as Backward Propagation.
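The forward step described above can be sketched in a few lines of JavaScript. This is a minimal illustration for a single neuron, assuming a sigmoid activation; the variable names and input values are made up for the example and are not from the article.

```javascript
// Weighted sum of one neuron's inputs: z = w1*x1 + w2*x2 + ... + wn*xn + b
function weightedSum(inputs, weights, bias) {
  return inputs.reduce((sum, x, i) => sum + x * weights[i], bias);
}

// A common activation function, squashing z into the range (0, 1)
function sigmoid(z) {
  return 1 / (1 + Math.exp(-z));
}

const x = [1.0, 2.0];     // input vector
const w = [0.5, -0.25];   // weights (one per input)
const b = 0.1;            // bias

// This output value would be passed on to the next layer.
const out = sigmoid(weightedSum(x, w, b));
```

During Backward Propagation, each weight would then be nudged in the direction that reduces the error, which is what the partial derivatives are used for.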

This is how a Neural Network works. Now that we understand Neural Networks, we can
jump into Activation Functions.

Note — I’m not going to go deep into Forward Propagation and Backward Propagation. If you don’t
have any idea about these topics, please learn them first and then follow this post.

Use nouns as relative paths with appropriate HTTP methods

Use a particular structure for every resource:

Resource | GET                       | POST                     | PUT                       | DELETE                    | PATCH
/users   | Returns a list of persons | Creates a new person     | Bulk update of persons    | Deletes all persons       | Updates specific properties of persons
/users/1 | Returns a specific person | Method not allowed (405) | Updates a specific person | Deletes a specific person | Updates specific properties of a person

Do not use verbs:

HTTP GET and query parameters should not alter the state.
Use the PUT, POST and DELETE methods instead of the GET method to alter the state.
Do not use GET for state changes:
GET /persons/1?activate or
GET /persons/1/activate

Use plural nouns
Do not mix up singular and plural nouns. Keep it simple: use only plural nouns for all resource collections, and use the singular for a single resource.

Use sub-resources for relations
If a resource is related to another resource, use sub-resources.
GET /persons/1/children/ Returns a list of children for person 1
GET /persons/1/children/1 Returns child #1 for person 1
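The resource structure above can be sketched as a plain dispatch map in JavaScript. This is framework-agnostic for illustration (a real app would use Express or similar); the route descriptions mirror the table and are not a real API.

```javascript
// Noun-based routes: the HTTP method carries the verb, the path stays a noun.
const routes = {
  "GET /users": "list all persons",
  "POST /users": "create a new person",
  "PUT /users": "bulk update persons",
  "DELETE /users": "delete all persons",
  "GET /users/1": "return person 1",
  "PUT /users/1": "update person 1",
  "DELETE /users/1": "delete person 1",
};

function dispatch(method, path) {
  // Unmapped combinations (e.g. POST on a single resource) get a 405.
  return routes[`${method} ${path}`] || "405 Method Not Allowed";
}
```

Note that no verb ever appears in the path itself; state changes are expressed purely by the method.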

Logging into an application is a fundamental, crucial, and yet thankless experience. When it works correctly, most users gloss over its underlying complexity. It is not part of the core product experience but just an expectation.
While designing the authorization stack, developers have to weigh security against convenience in their authentication approach.
The most common kinds of authentication in a web application are:
Basic authentication – send credentials with every request
Token based authentication – send credentials first time to receive a token, then use the token for all subsequent requests
OAuth – it’s complicated

Basic Authentication is exactly what it implies – basic. In this method of authentication, the user credentials have to be passed along with every HTTP request, usually in the format of username:password as a Base64 encoded string. This is passed along in the Authorization header.
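As a rough sketch of the header described above (the username, password, and endpoint here are made up for illustration), the Basic auth header can be built like this in Node:

```javascript
// username:password, Base64 encoded, goes in the Authorization header.
const credentials = Buffer.from("alice:s3cret").toString("base64");
const headers = { Authorization: "Basic " + credentials };

// Every single request must carry this header, e.g.:
// fetch("https://api.example.com/users", { headers })
```

Because the credentials travel with every request, Basic auth should only ever be used over HTTPS.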

In Token Based Authentication, the credentials only have to be supplied on login, upon which the auth server provides the client with a token signifying a valid user. All subsequent requests must supply the token to access protected resources.

React (also known as React.js or ReactJS) is a JavaScript library for building user interfaces. It is maintained by Facebook and a community of individual developers and companies.

React was created by Jordan Walke, a software engineer at Facebook. He was influenced by XHP, an HTML component framework for PHP. React was first deployed on Facebook’s newsfeed in 2011 and later on Instagram in 2012. It was open-sourced at JSConf US in May 2013.

Most developers choose to write React components in a JavaScript dialect known as JSX, essentially a mix of HTML and JavaScript. Tags need to be closed, and there are some differences, like using the attribute className instead of class for defining CSS classes. JSX is transformed into JavaScript for manipulating the DOM.
For example, this JSX:

var HelloMessage = React.createClass({
  render: function() {
    return <div>Hello {this.props.name}</div>;
  }
});

ReactDOM.render(<HelloMessage name="John" />, mountNode);

is transformed to JavaScript by a JSX parser:

"use strict";

var HelloMessage = React.createClass({
  displayName: "HelloMessage",
  render: function render() {
    return React.createElement(
      "div",
      null,
      "Hello ",
      this.props.name
    );
  }
});

ReactDOM.render(React.createElement(HelloMessage, { name: "John" }), mountNode);

Now, can we use React.js with PHP?
Yes, this is possible. React.js is just the ‘V’ in MVC; React doesn’t care what you are using on the backend. One can render React components on the server side in PHP using the V8Js PHP extension, but this is not necessary: server-side rendering in React.js is optional. Here are some things you can do:
1. Compile your whole React JSX code using Babel. It is better to use a module bundler like webpack and compile your React code into a single file, then upload that single file to your server.
2. Populate default states in your React code using PHP.
The best way to use PHP as a backend with React.js as the front end is to keep both separate. Build a standalone front end and use PHP to create APIs which interact with the database. Then consume the API through HTTP AJAX or whatever mechanism React.js provides.
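The API-consumption approach can be sketched as follows. The endpoint URL here is a made-up example of a PHP-served JSON API, and the fetch implementation is injectable so the flow can be exercised without a real server:

```javascript
// A React component would call this (e.g. from useEffect) and put the
// result into state; the PHP side just returns JSON.
async function loadUsers(fetchImpl = fetch) {
  const res = await fetchImpl("https://example.com/api/users.php");
  if (!res.ok) throw new Error("request failed: " + res.status);
  return res.json();
}
```

Keeping the boundary at a JSON API like this means the PHP backend and the React front end can be developed, deployed, and scaled independently.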

Laravel is the most popular and widely used framework in PHP application development. It has gained immense popularity in a short period of time and proven its worth in website development, e-commerce development, and web development, emerging as the go-to framework among developers. But is Laravel capable of proving its worth in the enterprise application domain? The answer is definitely yes. Here are some of the reasons why Laravel is a solid fit for enterprise applications.

Support for Rapid Development:
Laravel is based on the famous Model-View-Controller architecture, and MVC architecture always increases the speed of the development process.

Eloquent Object-Relational Mapper:
Eloquent, Laravel's object-relational mapper, is an Active Record implementation, which means the developer can work with the database using PHP rather than lengthy SQL syntax.

Modular Approach:
Laravel has adopted a modular approach to the software development process. In a modular approach, you get ready-made libraries to work with, which makes the developer's job very easy.

Migration for Database:
Laravel migrations help you evolve the structure of the database without needing to re-create it every time a change is made. This helps protect the development data from loss.

Multiple File System Support:
Laravel offers great support for multiple file systems through a third-party file system package, with simple configuration options for local or cloud-based storage.

Support for Unit Testing:
Testing is a very important aspect of application development, and Laravel ships with built-in support for unit testing that allows us to test our application.
Considering these powerful features that Laravel offers, it is quite clear that we can definitely use Laravel in Enterprise Application Development.

In today’s hyper-competitive landscape, an intense focus on customer-centricity has made retailers
adopt business models focused on customer loyalty to gain market leadership. With customer
engagement and retention as the core focus areas, loyalty programs are becoming
increasingly effective as strategic investments.

Unsurprisingly, consumers are inclined towards retailers and channels that offer maximum
value and preferential treatment, rewarding them with their purchase decisions.
While the loyalty management market is expected to be valued at USD 24.59 billion by 2021, many
organizations are still not able to meaningfully engage with their customers.
In a recent survey, some 57% of members admitted they don’t know their reward-points balance,
and only 25% of members were satisfied with the effort needed to earn a reward.
One possible solution could be to integrate different programs (across brands) into a consolidated
loyalty network, enabling customers to earn points from multiple schemes under one wallet that can
be used at multiple outlets. But this addresses only part of the problem. Businesses are often
hesitant to share existing customers’ data, and there are significant monetary
implications as well, not to mention the challenges pertaining to the integration of siloed systems and
databases, data security, and the coordination of multiple intermediaries.

Such a platform of interlinked loyalty programs opens up new business opportunities, both for
large and small operators. Large operators with already established programs can adopt new
service models and offer value-added services to other businesses, while small operators can
connect with other players in the industry and scale up their business.

TechVariable is a digital transformation and technology services company based out of Assam and
Bangalore, India. For the last 3 years, our core service has been taking an agile and collaborative
approach to creating customized solutions across industries, helping enterprises run their businesses more
efficiently. Our clients range from businesses in Chicago, Denmark, Dubai, Qatar, Sydney, and beyond.
To know more, visit our website or email us.

In enterprise-grade applications, and specifically in product data management, the main focus of PLM vendors was on how to manage CAD files, optimize the check-in/check-out process, and handle BoM management, process control, and measurement. This was mainly driven by the scientific discipline of “knowledge management” (a term that appeared in the early 1990s), which aimed to use software to manage knowledge bases, decision-support systems, and other joint efforts. But most PLM systems failed to deliver anything beyond data records that are yet to be discovered, analyzed, and connected.

The Forbes article “How Artificial Intelligence Is Revolutionizing Enterprise Software” is a good reminder that AI and ML are coming to the enterprise space, and we had better prepare so as not to miss that opportunity.

We can classify manufacturing environments into 3 main types: Make to Stock (or Build to Stock), Assemble (Configure) to Order, and Make (Engineer) to Order, sometimes called Build to Order. These three types of manufacturing environments bring different challenges for product lifecycle management and require different functions and capabilities. There has been a lot of research by PLM vendors into how the lifecycle of a BoM can be managed across the product lifecycle, and a lot of trial and error in the space of product configuration and variant management as well.

However, as futuristic as it may sound, all the development in the AI and learning space has made me think about a future intersection of PLM and AI platforms. Manufacturing is becoming more connected these days: the relationships between OEMs and suppliers, contractors, different product configurations, demand, global manufacturing, etc., together form a potential grid of information and options that cannot be digested and optimized by a human mind. There is a demand for “intelligent” platforms capable of making analyses and helping people make decisions. It is a moonshot, but I believe there will be a time when an “intelligent” PLM system will be able to suggest or prescribe an entire BoM structure along with the people likely to be involved across the lifecycle. Just my thoughts…

TechVariable is now live at the ‘Workshop on Blockchain Technology’ organised by the Govt. of Assam and the Directorate of Information Technology, Electronics and Communication,
presenting a technical session on Medical Data Management on the Blockchain.

Blockchain has set the market on fire. It is a very new technology with the potential to revolutionize industry systems as we know them. If you are someone who likes to keep up with new introductions to the cyber world, chances are you have heard about Blockchain tons of times. And why not? It has transcended its intended use and is currently evolving at a very fast pace. Every organization is busy implementing Blockchain and modifying it further to suit its needs.