Ad3303 Notes Unit 1
UNIT I – NODE.JS
1. Programming Languages: Server-side code can be written in many languages. Some popular choices include:
⮚ Python: Known for its readability and versatility, Python is widely used in web development.
⮚ Ruby: Often associated with the Ruby on Rails framework, it emphasizes developer productivity.
⮚ Java: A versatile language used for a wide range of applications, including server-side
development.
⮚ C#: Primarily used with Microsoft technologies, it's a powerful language for building web
applications.
2. Web Frameworks: Web frameworks provide a foundation for server-side programming by offering
pre-built libraries, tools, and abstractions that simplify development. Some popular web frameworks
include:
⮚ Django (Python): A high-level framework that emphasizes rapid development and follows the
model-view-template (MVT) architectural pattern, a variant of MVC.
⮚ Ruby on Rails (Ruby): An opinionated framework that promotes convention over configuration
and emphasizes developer productivity.
⮚ Laravel (PHP): A PHP framework known for its elegant, expressive syntax and rich
ecosystem.
⮚ Express.js (JavaScript/Node.js): A minimalist framework for building web applications using
JavaScript on the server-side.
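As a quick illustration of the last item, a minimal Express.js application (a sketch, assuming Express has been installed with npm install express) looks roughly like this:

var express = require('express');
var app = express();

// A single route that returns a plain-text greeting.
app.get('/', function (req, res) {
  res.send('Hello from Express!');
});

// Start the server on port 3000.
app.listen(3000, function () {
  console.log('Express app listening on port 3000');
});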
3. Server-Side Architecture: Server-side programming typically involves designing and implementing the
architecture for handling incoming requests and generating appropriate responses. Key components of
server-side architecture include:
⮚ Routing: Defining routes to map incoming requests to specific functions or methods in the
server-side code.
⮚ Data Persistence: Interacting with databases or other storage systems to retrieve, update, or store
data.
⮚ Business Logic: Implementing the core logic and algorithms that process requests and generate
responses.
⮚ Security: Implementing authentication, authorization, and other security measures to protect
sensitive data and resources.
⮚ Integration: Connecting with external services or APIs to perform tasks such as payment
processing or sending emails.
4. Dynamic Content Generation: Server-side programming enables the creation of dynamic web pages or
applications. By generating content on the server before delivering it to the client, developers can create
personalized and interactive experiences for users. This can involve dynamically generating HTML,
rendering templates, interacting with databases, or consuming external APIs.
5. Client-Server Communication: Server-side programming involves handling client requests and sending
appropriate responses. This can include handling HTTP methods like GET, POST, PUT, and DELETE,
processing form data, handling file uploads, and sending JSON or XML responses.
6. Scalability and Performance: As server-side applications handle multiple concurrent requests,
scalability and performance are crucial considerations. Techniques like caching, load balancing,
asynchronous programming, and database optimization are commonly employed to ensure efficient and
responsive server-side applications.
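As a small illustration of one such technique, the sketch below caches the result of an expensive operation (here, reading a file; a local data.txt file is assumed) in memory so that repeated requests are served without touching the disk again:

var fs = require('fs');

// Simple in-memory cache so repeated requests for the same file skip the disk.
var cache = {};

function getFileCached(path, callback) {
  if (cache[path]) {
    return callback(null, cache[path]);   // served from memory
  }
  fs.readFile(path, 'utf8', function (err, data) {
    if (err) return callback(err);
    cache[path] = data;                   // remember the result for later requests
    callback(null, data);
  });
}

// The second call is answered from the cache without another disk read.
getFileCached('data.txt', function (err, data) {
  getFileCached('data.txt', function (err2, cached) {
    console.log(cached === data);         // true
  });
});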
Why Server Side Programming
Though it is technically feasible to implement almost any business logic using client-side
programs, logically or functionally it serves no purpose when it comes to enterprise applications
(e.g., banking, air ticketing, e-shopping, etc.). To explain further, going by the client-side
programming logic, a bank having 10,000 customers would mean that each customer would have
a copy of the program(s) on his or her PC (and now even mobile), which translates to 10,000
programs! In addition, there are issues like security, resource pooling, concurrent access and
manipulation of the database which simply cannot be handled by client-side programs. The
answer to most of the issues cited above is “Server-Side Programming”. The figure illustrates the
server-side architecture in the simplest way.
iii. Migrating to newer versions, architectures, design patterns, adding patches, or switching to new
databases can be done at the server side without having to bother about the client’s hardware or
software capabilities.
iv. Issues relating to enterprise applications like resource management, concurrency, session
management, security and performance are managed by the server-side applications.
v. They are portable and possess the capability to generate dynamic and user-based content (e.g.,
displaying transaction information of credit card or debit card depending on user’s choice).
Types of Server-Side Programs
The following are the different types of server-side programs:
1. Active Server Pages (ASP)
2. Java Servlets
3. Java Server Pages (JSP)
4. Enterprise JavaBeans (EJB)
5. PHP
Node.js, often referred to as just Node, is a powerful tool that can run JavaScript
applications on both the server side as well as the client side. Node.js can be used to write static
file servers, Web application frameworks, messaging middleware, and servers for HTML5
multiplayer games. This article is a very elementary introduction to Node.js.
MULTI-TIER ARCHITECTURE
Multi-tier architecture provides a structured approach to designing and developing complex
applications. It promotes separation of concerns, scalability, reusability, and security. By dividing
the application into layers, developers can focus on specific areas of functionality and create
applications that are easier to develop, maintain, and scale.
The three-tier architecture is a design pattern that separates applications into web layers (client
layers), application layers, and database layers. Each layer has a specific responsibility and
communicates with the other layers through well-defined interfaces. Since it’s the most basic and
popular design, this post will take a look at the three-tier architecture with simple code.
Web Layer
In the web layer, we’re typically dealing with the front-end portion of our application, which
includes the client (e.g., web browser) and the server-side code that handles incoming requests and
sends back responses.
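A minimal sketch of this web-layer server, using only the built-in http module, could be:

var http = require('http');

// Create a server that answers every request with a plain-text message.
var server = http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello World!');
});

// Listen for incoming requests on port 3000.
server.listen(3000);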
In the code above, we create a new HTTP server using the http module that's built into Node.js.
We then listen for incoming requests on port 3000 and respond to each request with a plain text
message of "Hello World!".
Application Layer
✔ The application layer is where we typically handle the business logic of our application. In a
Node.js application, we might use an MVC architecture to separate the concerns of our
application into different components.
✔ Define controller functions that encapsulate the application logic.
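For example, controller functions in such an MVC-style layout might look like the following sketch (the file layout, the in-memory "model", and Express-style req/res objects are assumed for illustration):

// controllers/userController.js - controller functions encapsulating the application logic
var users = [
  { id: 1, name: 'mahesh' },
  { id: 2, name: 'suresh' }
];

// Handles GET /users: return the full list as JSON.
exports.listUsers = function (req, res) {
  res.json(users);
};

// Handles GET /users/:id: look up a single user and return 404 if not found.
exports.getUser = function (req, res) {
  var user = users.find(function (u) { return u.id === Number(req.params.id); });
  if (!user) {
    return res.status(404).json({ error: 'User not found' });
  }
  res.json(user);
};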
Client
The user interacts with the front-end part of a web application. The front-end is usually
developed using languages like HTML and CSS styles, along with extensive usage of
JavaScript-based frameworks like ReactJS and Angular, which help with application design.
Server
The server is responsible for taking the client requests, performing the required tasks, and
sending responses back to the clients. It acts as a middleware between the front-end and stored
data to enable operations on the data by a client. Node.js, PHP, and Java are the most popular
technologies in use to develop and maintain a web server.
Database
The database stores the data for a web application. The data can be created, updated, and deleted
whenever the client requests it. MySQL and MongoDB are among the most popular databases used
to store data for web applications.
Parts of the Node.js Architecture
▪ Requests
Incoming requests can be blocking (complex) or non-blocking (simple), depending upon the
tasks that a user wants to perform in a web application
▪ Node.js Server
Node.js server is a server-side platform that takes requests from users, processes those requests,
and returns responses to the corresponding users
▪ Event Queue
Event Queue in a Node.js server stores incoming client requests and passes those requests one-
by-one into the Event Loop
▪ Thread Pool
Thread pool consists of all the threads available for carrying out some tasks that might be
required to fulfill client requests
▪ Event Loop
Event Loop indefinitely receives requests and processes them, and then returns the responses to
corresponding clients
▪ External Resources
External resources are required to deal with blocking client requests. These resources can be for
computation, data storage, etc.
A single thread from the Thread Pool is assigned to a single complex request. This thread is
responsible for completing a particular blocking request by accessing the external resources,
such as compute, database, file system, etc.
Once the task is carried out completely, the response is sent back to the Event Loop, which in turn
sends that response back to the client.
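The difference between blocking and non-blocking requests can be seen with the built-in fs module: the synchronous call below blocks the Event Loop until the file is read, while the asynchronous call hands the work to the Thread Pool and lets the Event Loop keep serving other requests (a local data.txt file is assumed):

var fs = require('fs');

// Blocking: the Event Loop waits here until the whole file has been read.
var contents = fs.readFileSync('data.txt', 'utf8');
console.log(contents);

// Non-blocking: the read is delegated to the Thread Pool and the callback is
// queued back through the Event Loop once the data is ready.
fs.readFile('data.txt', 'utf8', function (err, data) {
  if (err) throw err;
  console.log(data);
});

console.log('This line runs before the asynchronous read completes.');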
Advantages of Node.js Architecture
Node.js Architecture comes with several advantages that give the server-side platform a distinct
upper-hand when compared to other server-side languages:
With the use of Event Queue and Thread Pool, the Node.js server enables efficient handling of a
large number of incoming requests.
Event Loop handles all requests one-by-one, so there is no need to create multiple threads.
Instead, a single thread is sufficient for the non-blocking requests, while blocking requests are
handed off to the Thread Pool.
Node.js server, most of the time, requires fewer resources and memory due to the way it handles
the incoming requests. Since the requests are processed one at a time, the overall process
becomes less taxing on the memory.
All of these advantages contribute to making servers developed using Node.js much faster and more
responsive when compared to those developed using other server development technologies.
NPM (Node Package Manager)
NPM stands for Node Package Manager. It is the default package manager for Node.js and is
completely written in JavaScript. NPM is a command-line client for Node.js that manages all of
the dependencies and modules. With the installation of Node.js, it is added to the system. Any
required packages and modules in node projects are installed using NPM. A package contains all
of the files required for a module, and modules are JavaScript libraries that can be included
in a Node project based on the project's requirements. It includes a large number of libraries that
serve as excellent tools for Node.js developers, speeding up the entire application development
process. NPM can use the package.json file to install all of a project's dependencies. It has the
ability to upgrade and remove packages. Each dependency may specify a range of valid versions
using the semantic versioning scheme in the package.json file, allowing developers to
auto-update their packages while avoiding unwanted breaking changes.
Some basic commands:-
● npm i <packageName> :- installs a local package.
● npm i -g <packageName> :- installs a global package.
● npm un <packageName> :- uninstalls a local package.
● npm up :- updates the packages.
● npm t :- runs the tests.
● npm ls :- lists all the installed modules.
● npm ll :- prints additional package information while listing modules.
Uses of NPM:-
⮚ It aids in the download of a number of standalone tools that can be used right away.
⮚ NPM is frequently used by developers to share their code with other NPM users all over
the world.
⮚ It helps in managing and maintaining various versions of codes and their dependencies.
⮚ NPM automatically updates the application when the underlying codes are updated.
⮚ It functions as a community where you can connect with other developers who are
working on similar projects or tasks.
❖ Package:-
A package must contain a package.json file in order to be published to the npm registry.
Packages can be unscoped or scoped to a user or organization, and scoped packages can be
private or public.
A package is any of the following:
a) A folder containing a program described by a package.json file.
b) A zipped tarball containing (a).
c) A URL that resolves to (b).
d) A <name>@<version> that is published on the registry with (c).
e) A <name>@<tag> that points to (d).
f) A <name> that has a latest tag satisfying (e).
g) A git url that, when cloned, results in (a).
❖ Module :-
From the official documentation, a module is any file or directory in the node_modules directory
that can be loaded by the Node.js require() function. To be loaded by the Node.js require()
function, a module must be one of the following:
1. A folder with a package.json file containing a "main" field.
2. A JavaScript file.
Since modules are not required to have a package.json file, not all modules are packages. Only
modules that have a package.json file are packages. In the context of a Node program, the
module is also the thing that was loaded from a file.
For example, in the following program:
var req = require('request')
We can say that the variable ‘req’ refers to the ‘request’ module.
❖ Dependency hell:-
Let us assume that there are three modules: X, Y and Z. X requires Y at v3.0, and Z also
requires Y, but at v4.0. We can visualize this as X → Y v3.0 and Z → Y v4.0.
Let us assume that we have an application that requires both module X and module Z.
❖ The order in which things are installed will change the node modules directory tree
structure, since dependency resolution is dependent on installation order.
Let us assume we have a module X. X requires Y. Now let's create an application that requires
module X. On npm install, npm version 3 will install both module X and its dependency, module
Y, inside the /node_modules directory, flat. In npm version 2 this would have happened in a
nested way.
Let us suppose that we want to require another module, Z. Z requires Y, but at a different version
(say v2.0) than the one required by X (say v1.0).
Since Y v1.0 is already a top-level dependency, we cannot install Y v2.0 as a top-level dependency. npm
v3 handles this by behaving in a similar fashion to npm v2 and nesting the new, different module Y
version dependency under the module that requires it, that is, module Z.
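As an illustration (with the version numbers used above), the resulting node_modules trees would look roughly like this:

node_modules/            (npm v2: dependencies nested under the module that requires them)
└── X/
    └── node_modules/
        └── Y/           (v1.0, required by X)

node_modules/            (npm v3: dependencies installed flat; only the conflicting version is nested)
├── X/
├── Y/                   (v1.0, top level)
└── Z/
    └── node_modules/
        └── Y/           (v2.0, required by Z)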
Installation of NPM:-
To install NPM, it is required to install Node.js as NPM gets installed with Node.js automatically.
The version of npm installed on the system can be checked using the following command:
npm -v
If the version is not the latest one, we can install it using the command:
npm install -g npm@latest
Creating a Node Project:-
To create a new node project, we use npm init in the folder in which we want to create the
project. A number of questions like name, license, scripts, description, author, keywords, version
and main file will be asked. After the project is created, a package.json file will be created in the
project folder which confirms that the project has been initialized.
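A freshly initialized package.json (with illustrative answers to the npm init questions) looks roughly like this:

{
  "name": "my-node-project",
  "version": "1.0.0",
  "description": "A sample Node project",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}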
Installation of Packages and Modules:-
Following the creation of the project, the next step is to add the packages and modules that will
be used in the Node Project. Use the following code to install packages and modules in the
project:
npm install package_name
Example:
To install the mongoose package in our project, we use the following command :
npm install mongoose
We can add an extra -g tag to the package installation syntax to make it global, that is, accessible
by all projects in the system.
Controlling the versions of the packages to be installed:-
⮚ In the package.json file, specify the complete and exact version to install a package of a
particular version.
⮚ The caret symbol (^) tells npm to install the latest version that is compatible with the specified
version number, i.e. the same major version with newer minor or patch releases allowed.
⮚ Mention "*" in front of the dependency or "latest" to install the most recent version of the
package. This will locate and install the most recent stable version of the module.
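For example, the dependencies section of package.json might pin versions in any of these ways (the package names and versions are illustrative):

"dependencies": {
  "express": "4.18.2",
  "mongoose": "^6.7.0",
  "moment": "*"
}

Here "4.18.2" installs exactly that version, "^6.7.0" accepts any compatible 6.x release at or above 6.7.0, and "*" always takes the most recent release.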
Controlling where the packages and the modules get installed:-
We can add the --save flag to install a package while also saving it to the package.json file.
Because the --save flag is set by default in the npm install command, it is equivalent to npm
install package_name.
--save-prod: Installing with this flag causes the package to appear in dependencies, which is also
the default.
--save-dev: The package will appear in devDependencies and will only be used in
development mode.
Simply type npm install in the terminal if there is already a package.json file with all the
packages listed as dependencies. NPM will examine the package.json file and install all of the
dependencies in the order in which they are listed in the file. When a Node project is forked and
cloned, this command is usually used.
NPM installs dependencies in local mode (by default), into the node_modules directory of the Node
application folder.
Use the npm ls command to see all the locally installed modules.
Uninstalling Packages:-
To uninstall packages using npm, type the command:
npm uninstall package_name
To uninstall global packages, type the command:
npm uninstall package_name -g
Some Node.Js Frameworks: -
- Express.js - Express for Everyone
- Koa.js - Next Generation Node.js Framework
- Meteor.js - One Application, One Language
- Socket.io - Chat Apps Made Easy with Socket.io
DEVELOPMENT ENVIRONMENT
One way of setting up a development environment is to install everything into $HOME/local/$PACKAGE.
For example, Node.js can be built and installed from its source directory with:
./configure --prefix=$HOME/local/node-v0.4.5 && make install
To have my paths automatically set I put this inside my $HOME/.zshrc:
PATH="$HOME/local/bin:/opt/local/bin:/usr/bin:/sbin:/bin"
LD_LIBRARY_PATH="/opt/local/lib:/usr/local/lib:/usr/lib"
for i in $HOME/local/*; do
Now that we have our image built, let’s run it as a container and test that it’s working.
docker run --rm -p 8081:8081 --name notes notes-service
When we run this, the service fails with an error showing that we’re having trouble connecting to
MongoDB. Two things are broken at this point:
1. We didn’t provide a connection string to the application.
2. We don’t have MongoDB running locally.
To resolve this, we could provide a connection string to a shared instance of our database, but we
want to be able to manage our database locally and not have to worry about messing up our
colleagues’ data they might be using to develop.
Step 3: Run MongoDB in a localized container
Instead of downloading MongoDB, installing, configuring, and then running the Mongo database
service ourselves, we can use the Docker Official Image for MongoDB and run it in a container.
Before we run MongoDB in a container, we want to create a couple of volumes that Docker can
manage to store our persistent data and configuration. I like to use the managed volumes that
Docker provides instead of using bind mounts. You can read all about volumes in our
documentation.
Creating volumes for Docker
To create our volumes, we’ll create one for the data and one for the configuration of MongoDB.
docker volume create mongodb
docker volume create mongodb_config
Creating a user-defined bridge network
Now we’ll create a network that our application and database will use to talk with each other.
The network is called a user-defined bridge network and gives us a nice DNS lookup service that
we can use when creating our connection string.
docker network create mongodb
Now, we can run MongoDB in a container and attach it to the volumes and network we created
above. Docker will pull the image from Hub and run it for you locally.
docker run -it --rm -d \
  -v mongodb:/data/db \
  -v mongodb_config:/data/configdb \
  -p 27017:27017 \
  --network mongodb \
  --name mongodb \
  mongo
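With MongoDB running on the user-defined network, the notes-service container can be started on the same network and handed a connection string that uses the container name as the hostname. A sketch of such a command, using the DATABASE_CONNECTIONSTRING variable that also appears in the Compose file later in these notes (the exact variable name depends on how the service reads its configuration), would be:

docker run -it --rm -d \
  --network mongodb \
  --name notes \
  -p 8081:8081 \
  -e DATABASE_CONNECTIONSTRING=mongodb://mongodb:27017/notes \
  notes-service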
In Part I, we took a look at creating Docker images and running containers for Node.js
applications. We also took a look at setting up a database in a container and how volumes and
networks play a part in setting up your local development environment.
In Part II, we’ll take a look at creating and running a development image where we can compile,
add modules and debug our application all inside of a container. This helps speed up the
developer setup time when moving to a new application or project. In this case, our image should
have Node.js installed as well as NPM or YARN.
We’ll also take a quick look at using Docker Compose to help streamline the processes of setting
up and running a full microservices application locally on your development machine.
Let’s create a development image we can use to run our Node.js application.
Step 1: Develop your Dockerfile
Create a local directory on your development machine that we can use as a working directory to
save our Dockerfile and any other files that we’ll need for our development image.
$ mkdir -p ~/projects/dev-image
Create a Dockerfile in this folder and add the following commands.
FROM node:18.7.0
RUN apt-get update && apt-get install -y \
    nano \
    vim
We start off by using the node:18.7.0 official image. I’ve found that this image is fine for
creating a development image. I like to add a couple of text editors to the image in case I want to
quickly edit a file while inside the container.
We did not add an ENTRYPOINT or CMD to the Dockerfile because we will rely on the base
image’s ENTRYPOINT, and we will override the CMD when we start the image.
Step 2: Build your Docker image
Let’s build our image.
$ docker build -t node-dev-image .
And now we can run it.
$ docker run -it --rm --name dev -v $(pwd):/code node-dev-image bash
You will be presented with a bash command prompt. Now, inside the container, we can create a
JavaScript file and run it with Node.js.
Step 3: Test your image
Run the following commands to test our image.
$ node -e 'console.log("hello from inside our container")'
hello from inside our container
If all goes well, we have a working development image. We can now do everything that we
would do in our normal bash terminal.
If you run the above Docker command inside of the notes-service directory, then you will have
access to the code inside of the container. You can start the notes-service by simply navigating to
the /code directory and running npm run start.
Step 4: Use Compose to Develop locally
The notes-service project uses MongoDB as its data store. If you remember from Part I, we had
to start the Mongo container manually and connect it to the same network as our notes-service.
We also had to create a couple of volumes so we could persist our data across restarts of our
application and MongoDB.
Instead, we’ll create a Compose file to start our notes-service and the MongoDb with one
command. We’ll also set up the Compose file to start the notes-service in debug mode. This way,
we can connect a debugger to the running node process.
Open the notes-service in your favorite IDE or text editor and create a new file named
docker-compose.dev.yml. Copy and paste the following into the file.
services:
  notes:
    build:
      context: .
    ports:
      - 8080:8080
      - 9229:9229
    environment:
      - SERVER_PORT=8080
      - DATABASE_CONNECTIONSTRING=mongodb://mongo:27017/notes
    volumes:
      - ./:/code
    command: npm run debug
  mongo:
    image: mongo:4.2.8
    ports:
      - 27017:27017
    volumes:
      - mongodb:/data/db
      - mongodb_config:/data/configdb
volumes:
  mongodb:
  mongodb_config:
This compose file is super convenient because now we don’t have to type all the parameters to
pass to the `docker run` command. We can declaratively do that in the compose file.
We are exposing port 9229 so that we can attach a debugger. We are also mapping our local
source code into the running container so that we can make changes in our text editor and have
those changes picked up in the container.
One other really cool feature of using the compose file is that we have service resolution setup to
use the service names. As a result, we are now able to use “mongo” in our connection string. We
use “mongo” because that is what we have named our mongo service in the compose file.
Let’s start our application and confirm that it is running properly.
$ docker compose -f docker-compose.dev.yml up --build
We pass the "--build" flag so Docker will compile our image and then start it. If all goes well,
you should see the logs from the notes and mongo services.
Now let’s test our API endpoint. Run the following curl command:
$ curl --request GET --url http://localhost:8080/services/m/notes
You should receive the following response:
{"code":"success","meta":{"total":0,"count":0},"payload":[]}
Step 5: Connect to a Debugger
We’ll use the debugger that comes with the Chrome browser. Open Chrome on your machine,
and then type about:inspect into the address bar. The following screen will open.
Click the “Open dedicated DevTools for Node” link. This will open the DevTools that are
connected to the running Node.js process inside our container.
Let’s change the source code and then set a breakpoint.
Add the following code to the server.js file on line 19 and save the file.
Conclusion
In this article, we completed the first steps in Dockerizing our local development environment
for Node.js. Then, we took things a step further and created a general development image that
can be used like our normal command line. We also set up our compose file to map our source
code into the running container and exposed the debugging port.
REST stands for REpresentational State Transfer. REST is a web-standards-based architecture that
uses the HTTP protocol. It revolves around resources, where every component is a resource, and a
resource is accessed through a common interface using standard HTTP methods. REST was first
introduced by Roy Fielding in 2000.
A REST server simply provides access to resources, and a REST client accesses and modifies the
resources using the HTTP protocol. Here each resource is identified by a URI/global ID. REST
uses various representations for a resource, such as text, JSON and XML, but JSON is the most
popular one.
HTTP methods
The following four HTTP methods are commonly used in REST-based architecture.
● GET − This is used to provide read-only access to a resource.
● PUT − This is used to update or replace an existing resource.
● DELETE − This is used to remove a resource.
● POST − This is used to create a new resource.
List Users
Let's implement our first RESTful API listUsers using the following code in a server.js file −
server.js
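A minimal sketch of the listUsers handler (assuming the Express framework and a users.json file in the same directory holding the data shown below) could be:

var express = require('express');
var app = express();
var fs = require("fs");

// GET /listUsers: read the users.json file and return its contents.
app.get('/listUsers', function (req, res) {
   fs.readFile( __dirname + "/" + "users.json", 'utf8', function (err, data) {
      console.log( data );
      res.end( data );
   });
})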
Now try to access the defined API using the URL http://127.0.0.1:8081/listUsers and the HTTP method
GET on your local machine using any REST client. (You can change the given IP address when you put
the solution in a production environment.) This should produce the following result −
{
"user1" : {
"name" : "mahesh", "password" : "password1", "profession" : "teacher", "id": 1
},
"user2" : {
"name" : "suresh", "password" : "password2", "profession" : "librarian", "id": 2
},
"user3" : {
"name" : "ramesh", "password" : "password3", "profession" : "clerk", "id": 3
}
}
Add User
The following API will show you how to add a new user to the list. Following is the detail of the new
user −
user = { "user4" : {
"name" : "mohit", "password" : "password4", "profession" : "teacher", "id": 4
}
}
You can accept the same input in the form of JSON using an Ajax call, but for teaching purposes
we are hard-coding it here. Following is the addUser API to add a new user to the database −
server.js
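A minimal sketch of the addUser handler (assuming the same Express app, fs module and users.json file as in listUsers, plus the hard-coded user object above) could be:

app.post('/addUser', function (req, res) {
   // Read the existing users, add the hard-coded user4 entry, and return the updated list.
   fs.readFile( __dirname + "/" + "users.json", 'utf8', function (err, data) {
      data = JSON.parse( data );
      data["user4"] = user["user4"];
      console.log( data );
      res.end( JSON.stringify(data));
   });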
})
Now try to access the defined API using the URL http://127.0.0.1:8081/addUser and the HTTP method
POST on your local machine using any REST client. This should produce the following result −
{
"user1":{"name":"mahesh","password":"password1","profession":"teacher","id":1},
"user2":{"name":"suresh","password":"password2","profession":"librarian","id":2},
"user3":{"name":"ramesh","password":"password3","profession":"clerk","id":3},
"user4":{"name":"mohit","password":"password4","profession":"teacher","id":4}
}
Show Detail
Now we will implement an API that will be called using a user ID and will display the detail of the
corresponding user.
server.js
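A minimal sketch of this handler (assuming the same Express app, fs module and users.json file as before, with the user ID taken from the URL) could be:

app.get('/:id', function (req, res) {
   // Look up the user whose ID appears in the URL, e.g. /2 returns user2.
   fs.readFile( __dirname + "/" + "users.json", 'utf8', function (err, data) {
      var users = JSON.parse( data );
      var user = users["user" + req.params.id];
      console.log( user );
      res.end( JSON.stringify(user));
   });
})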
Now try to access the defined API using the URL http://127.0.0.1:8081/2 and the HTTP method GET on
your local machine using any REST client. This should produce the detail of user2 −
{"name":"suresh","password":"password2","profession":"librarian","id":2}
Delete User
The following API will show you how to delete a user record, here user2 (hard-coded, as with addUser),
from the list.
server.js
app.delete('/deleteUser', function (req, res) {
   fs.readFile( __dirname + "/" + "users.json", 'utf8', function (err, data) {
      data = JSON.parse( data );
      delete data["user" + 2];
      console.log( data );
      res.end( JSON.stringify(data));
   });
})
var server = app.listen(8081, function () {
   var host = server.address().address
var port = server.address().port
console.log("Example app listening at http://%s:%s", host, port)
})
Now try to access the defined API using the URL http://127.0.0.1:8081/deleteUser and the HTTP method
DELETE on your local machine using any REST client. This should produce the following result −
{"user1":{"name":"mahesh","password":"password1","profession":"teacher","id":1},
"user3":{"name":"ramesh","password":"password3","profession":"clerk","i}}