Top 50 JavaScript & Node.js Interview Questions and Answers – Beginner to Advanced
Introduction
JavaScript is the world's most widely used programming language, and Node.js has made it a first-class citizen on the server side. Whether you are applying for a frontend, backend, or full-stack role, JavaScript interview questions are unavoidable — and Node.js questions are increasingly common even for frontend positions.
This guide walks through 50 carefully selected questions grouped by topic, each with a thorough explanation and real code examples you can run yourself. Difficulty labels help you gauge what level each question targets, and the tips section at the end tells you exactly what interviewers are looking for beyond the textbook answer.
Section 1 – JavaScript Core Concepts
What is the difference between var, let, and const? (Beginner)

| Feature | var | let | const |
|---|---|---|---|
| Scope | Function-scoped | Block-scoped | Block-scoped |
| Hoisting | Hoisted & initialized to undefined | Hoisted but NOT initialized (TDZ) | Hoisted but NOT initialized (TDZ) |
| Re-declaration | Allowed | Not allowed | Not allowed |
| Re-assignment | Allowed | Allowed | Not allowed |
| Global object property | Yes (window.x) | No | No |
// var is function-scoped
function testVar() {
if (true) {
var x = 10;
}
console.log(x); // 10 — var leaks out of the if block
}
// let is block-scoped
function testLet() {
if (true) {
let y = 20;
}
console.log(y); // ReferenceError: y is not defined
}
// const cannot be reassigned
const PI = 3.14;
PI = 3; // TypeError: Assignment to constant variable
// BUT: const objects can be mutated
const user = { name: "Saiful" };
user.name = "John"; // This works — we mutated the object, not the binding
user = {}; // TypeError — this tries to reassign the binding
💡 Tip: Use const by default. Switch to let only when you need to reassign. Never use var in modern JavaScript.

What is a closure? (Beginner)

A closure is a function that remembers the variables from its outer scope even after the outer function has finished executing. Closures are one of the most powerful and most frequently tested JavaScript concepts.
function makeCounter() {
let count = 0; // this variable is "enclosed"
return function() {
count++;
return count;
};
}
const counter = makeCounter();
console.log(counter()); // 1
console.log(counter()); // 2
console.log(counter()); // 3
// count is not accessible from outside
console.log(count); // ReferenceError
Real-world uses of closures:
- Data encapsulation / private variables
- Factory functions
- Memoization / caching
- Event handlers and callbacks
- Partial application and currying
// Closure for private state
function createBankAccount(initialBalance) {
let balance = initialBalance; // private
return {
deposit: (amount) => { balance += amount; },
withdraw: (amount) => { balance -= amount; },
getBalance: () => balance
};
}
const account = createBankAccount(1000);
account.deposit(500);
console.log(account.getBalance()); // 1500
console.log(account.balance); // undefined — truly private
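One of the closure uses listed above — partial application and currying — can be sketched in a few lines. This is a minimal illustration; curry and add3 are invented names, not library functions:

```javascript
// A generic curry helper built on closures — each partial call
// "closes over" the arguments collected so far.
function curry(fn) {
  return function curried(...args) {
    if (args.length >= fn.length) return fn(...args); // enough args — invoke
    return (...rest) => curried(...args, ...rest);    // otherwise keep collecting
  };
}

const add3 = (a, b, c) => a + b + c;
const curriedAdd = curry(add3);

console.log(curriedAdd(1)(2)(3)); // 6
console.log(curriedAdd(1, 2)(3)); // 6
```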
What is hoisting? (Beginner)

Hoisting is JavaScript's behavior of moving declarations to the top of their scope during the compilation phase, before any code executes. Only the declaration is hoisted, not the initialization.
// Function declarations are fully hoisted
console.log(greet("Saiful")); // "Hello, Saiful" — works before declaration
function greet(name) {
return `Hello, ${name}`;
}
// var declarations are hoisted but initialized to undefined
console.log(age); // undefined (not ReferenceError)
var age = 25;
console.log(age); // 25
// let and const are hoisted but NOT initialized — Temporal Dead Zone (TDZ)
console.log(city); // ReferenceError: Cannot access 'city' before initialization
let city = "Dhaka";
💡 Tip: The Temporal Dead Zone (TDZ) is the region of a scope between its start and the point where a let or const variable is declared. Accessing a variable in its TDZ throws a ReferenceError.

What is the difference between == and ===? (Beginner)

== performs loose equality with type coercion — JavaScript tries to convert both values to the same type before comparing. === performs strict equality — no type conversion; both value and type must match.
console.log(0 == false); // true (false coerces to 0)
console.log(0 === false); // false (different types)
console.log("" == false); // true (both coerce to 0)
console.log("" === false); // false
console.log(null == undefined); // true (special case)
console.log(null === undefined); // false
console.log(1 == "1"); // true ("1" coerces to 1)
console.log(1 === "1"); // false
💡 Tip: Always use === in production code. The coercion rules of == are notoriously confusing and a common source of bugs.

What is the difference between null and undefined? (Beginner)

- undefined — the variable has been declared but not assigned a value. JavaScript sets this automatically.
- null — an intentional absence of value. A developer explicitly sets this to signal "no value here".
let a;
console.log(a); // undefined (declared, not assigned)
console.log(typeof a); // "undefined"
let b = null;
console.log(b); // null (intentionally empty)
console.log(typeof b); // "object" — famous JS quirk/bug
// Checking for both
function process(value) {
if (value == null) { // true for both null and undefined
return "no value";
}
return value;
}
What is the difference between call, apply, and bind? (Intermediate)

All three methods control the value of this inside a function, but they differ in how arguments are passed and when the function executes.
| Method | Executes Immediately? | Arguments |
|---|---|---|
| call | Yes | Passed individually: fn.call(ctx, a, b) |
| apply | Yes | Passed as array: fn.apply(ctx, [a, b]) |
| bind | No — returns a new function | Passed individually; executed later |
const person = { name: "Saiful" };
function introduce(role, company) {
return `I'm ${this.name}, a ${role} at ${company}`;
}
// call — execute immediately, args individually
console.log(introduce.call(person, "Developer", "TriksBuddy"));
// apply — execute immediately, args as array
console.log(introduce.apply(person, ["Developer", "TriksBuddy"]));
// bind — returns a new function for later use
const boundIntroduce = introduce.bind(person, "Developer");
console.log(boundIntroduce("TriksBuddy")); // call later with remaining args
How does prototypal inheritance work in JavaScript? (Intermediate)

JavaScript uses prototypal inheritance — every object has an internal link to another object called its prototype. When you access a property, JavaScript looks at the object itself first, then walks up the prototype chain until it finds the property or reaches null.
// Constructor function approach
function Animal(name) {
this.name = name;
}
Animal.prototype.speak = function() {
return `${this.name} makes a sound`;
};
function Dog(name, breed) {
Animal.call(this, name); // inherit properties
this.breed = breed;
}
Dog.prototype = Object.create(Animal.prototype); // inherit methods
Dog.prototype.constructor = Dog;
Dog.prototype.bark = function() {
return `${this.name} barks!`;
};
const dog = new Dog("Rex", "Labrador");
console.log(dog.speak()); // "Rex makes a sound" (inherited from Animal)
console.log(dog.bark()); // "Rex barks!" (own method)
// Modern class syntax (syntactic sugar over the above)
class Cat extends Animal {
meow() {
return `${this.name} meows!`;
}
}
const cat = new Cat("Whiskers");
console.log(cat.speak()); // inherited
console.log(cat.meow()); // own
What is the difference between map, filter, and reduce? (Beginner)

| Method | Returns | Use When |
|---|---|---|
| map | New array of the same length | Transforming each element |
| filter | New array (possibly shorter) | Selecting elements that match a condition |
| reduce | Single accumulated value | Aggregating an array into one value |
const products = [
{ name: "Laptop", price: 1200, inStock: true },
{ name: "Mouse", price: 25, inStock: false },
{ name: "Keyboard", price: 80, inStock: true },
{ name: "Monitor", price: 400, inStock: true }
];
// map — transform to array of names
const names = products.map(p => p.name);
// ["Laptop", "Mouse", "Keyboard", "Monitor"]
// filter — only in-stock products
const inStock = products.filter(p => p.inStock);
// [{Laptop...}, {Keyboard...}, {Monitor...}]
// reduce — total price of in-stock items
const total = products
.filter(p => p.inStock)
.reduce((sum, p) => sum + p.price, 0);
// 1680
What is the difference between the spread and rest operators? (Beginner)

Both use the ... syntax but serve opposite purposes:
- Spread — expands an iterable into individual elements
- Rest — collects remaining elements into an array
// Spread — expanding
const arr1 = [1, 2, 3];
const arr2 = [4, 5, 6];
const combined = [...arr1, ...arr2]; // [1, 2, 3, 4, 5, 6]
// Copy an object without mutation
const original = { name: "Saiful", role: "Developer" };
const updated = { ...original, role: "Senior Developer" };
// Rest — collecting remaining arguments
function sum(first, second, ...rest) {
console.log(first); // 1
console.log(second); // 2
console.log(rest); // [3, 4, 5]
return first + second + rest.reduce((a, b) => a + b, 0);
}
sum(1, 2, 3, 4, 5); // 15
What is destructuring? (Beginner)

Destructuring lets you unpack values from arrays, or properties from objects, into distinct variables in a single, clean expression.
// Object destructuring
const user = { name: "Saiful", age: 30, city: "Dhaka" };
const { name, age, city = "Unknown" } = user;
console.log(name, age, city); // "Saiful" 30 "Dhaka"
// Rename while destructuring
const { name: userName, age: userAge } = user;
// Array destructuring
const [first, second, , fourth] = [10, 20, 30, 40];
console.log(first, second, fourth); // 10 20 40
// Destructuring in function parameters
function displayUser({ name, role = "User", age }) {
return `${name} (${role}), age ${age}`;
}
displayUser({ name: "Saiful", age: 30 }); // "Saiful (User), age 30"
// Swapping variables
let a = 1, b = 2;
[a, b] = [b, a];
console.log(a, b); // 2 1
What is the difference between a shallow copy and a deep copy? (Intermediate)

A shallow copy copies only the top-level properties — nested objects are still shared by reference, not duplicated. A deep copy recursively copies all levels, creating a completely independent object.
const original = {
name: "Saiful",
address: { city: "Dhaka", country: "Bangladesh" }
};
// Shallow copy — nested object is still shared
const shallow = { ...original };
shallow.address.city = "Chittagong";
console.log(original.address.city); // "Chittagong" — original was mutated!
// Deep copy — using structuredClone (modern, built-in)
const deep = structuredClone(original);
deep.address.city = "Sylhet";
console.log(original.address.city); // "Dhaka" — original is safe
// Alternative: JSON (has limitations — loses functions, Dates, undefined)
const jsonCopy = JSON.parse(JSON.stringify(original));
💡 Tip: structuredClone() was added to JavaScript runtimes in 2022 and is now the recommended way to deep copy objects in modern environments.

What is a Higher-Order Function? (Beginner)

A Higher-Order Function (HOF) is a function that either takes one or more functions as arguments, or returns a function as its result. map, filter, and reduce are all higher-order functions.
// HOF that takes a function as argument
function applyTwice(fn, value) {
return fn(fn(value));
}
const double = x => x * 2;
console.log(applyTwice(double, 5)); // 20 (5 → 10 → 20)
// HOF that returns a function (function factory)
function createMultiplier(multiplier) {
return (number) => number * multiplier;
}
const triple = createMultiplier(3);
const quadruple = createMultiplier(4);
console.log(triple(5)); // 15
console.log(quadruple(5)); // 20
What is memoization? (Intermediate)

Memoization is an optimization technique that caches the results of expensive function calls and returns the cached result for the same inputs, avoiding redundant computation.
// Without memoization — recalculates every time
function fibonacci(n) {
if (n <= 1) return n;
return fibonacci(n - 1) + fibonacci(n - 2);
}
// fibonacci(40) is very slow — exponential time complexity
// With memoization
function memoize(fn) {
const cache = new Map();
return function(...args) {
const key = JSON.stringify(args);
if (cache.has(key)) {
console.log(`Cache hit for: ${key}`);
return cache.get(key);
}
const result = fn.apply(this, args);
cache.set(key, result);
return result;
};
}
const memoFib = memoize(function fib(n) {
if (n <= 1) return n;
return memoFib(n - 1) + memoFib(n - 2);
});
console.log(memoFib(40)); // Very fast — each value calculated only once
What is the difference between for...in and for...of? (Beginner)

- for...in — iterates over the enumerable property keys of an object (including inherited ones)
- for...of — iterates over the values of any iterable (arrays, strings, Maps, Sets, generators)
const obj = { a: 1, b: 2, c: 3 };
for (const key in obj) {
console.log(key); // "a", "b", "c" — keys, not values
}
const arr = [10, 20, 30];
for (const value of arr) {
console.log(value); // 10, 20, 30 — values
}
// for...in on an array gives index keys — usually not what you want
for (const key in arr) {
console.log(key); // "0", "1", "2" — index strings
}
💡 Tip: Use for...of for arrays and other iterables. Use for...in only for plain objects when you need to iterate over keys.

What are WeakMap and WeakSet? (Advanced)

WeakMap and WeakSet hold weak references to their keys — the garbage collector can reclaim a key as soon as no other references to it exist. They are not iterable, and because entries disappear along with their keys, they help prevent memory leaks.
// WeakMap — private data storage per object instance
const privateData = new WeakMap();
class User {
constructor(name, password) {
privateData.set(this, { password }); // password stored privately
this.name = name;
}
checkPassword(input) {
return privateData.get(this).password === input;
}
}
const user = new User("Saiful", "secret123");
console.log(user.checkPassword("secret123")); // true
console.log(user.password); // undefined — truly private
// When user is garbage collected, its WeakMap entry is automatically removed
// No memory leak!
Section 2 – Asynchronous JavaScript
What is the Event Loop? (Intermediate)

JavaScript is single-threaded — it can execute only one thing at a time. The Event Loop is the mechanism that lets JavaScript handle asynchronous operations without blocking the main thread. Its key components:
- Call Stack — where synchronous code executes, one frame at a time
- Web APIs / Node.js APIs — where async operations (setTimeout, fetch, I/O) actually run
- Callback Queue (Task Queue) — where callbacks from async operations wait
- Microtask Queue — higher-priority queue for Promise callbacks and queueMicrotask()
- Event Loop — continuously checks whether the call stack is empty, then pulls tasks from the queues
console.log("1 - Start");
setTimeout(() => console.log("2 - setTimeout"), 0);
Promise.resolve().then(() => console.log("3 - Promise"));
queueMicrotask(() => console.log("4 - Microtask"));
console.log("5 - End");
// Output order:
// 1 - Start
// 5 - End
// 3 - Promise ← microtask queue runs before callback queue
// 4 - Microtask ← microtask queue
// 2 - setTimeout ← callback queue (runs last)
💡 Tip: Microtasks always run before setTimeout callbacks, even with setTimeout(fn, 0).

What is a Promise? (Beginner)

A Promise is an object representing the eventual completion or failure of an asynchronous operation. It has three states:
- Pending — initial state, neither fulfilled nor rejected
- Fulfilled — the operation completed successfully
- Rejected — the operation failed
Once a Promise settles (fulfilled or rejected), it cannot change state again.
// Creating a Promise
function fetchUserData(userId) {
return new Promise((resolve, reject) => {
setTimeout(() => {
if (userId > 0) {
resolve({ id: userId, name: "Saiful" }); // success
} else {
reject(new Error("Invalid user ID")); // failure
}
}, 1000);
});
}
// Consuming a Promise
fetchUserData(1)
.then(user => console.log("Got user:", user))
.catch(error => console.error("Error:", error.message))
.finally(() => console.log("Request complete")); // always runs
What is the difference between Promise.all, Promise.allSettled, Promise.race, and Promise.any? (Intermediate)

| Method | Resolves When | Rejects When | Use Case |
|---|---|---|---|
| Promise.all | ALL promises fulfill | ANY promise rejects | All tasks must succeed |
| Promise.allSettled | ALL promises settle (any outcome) | Never rejects | Need results of all, even failures |
| Promise.race | FIRST promise settles | FIRST promise rejects | Timeouts, first-wins scenarios |
| Promise.any | FIRST promise fulfills | ALL promises reject | Try multiple sources, take the first success |
const p1 = fetch("/api/users");
const p2 = fetch("/api/products");
const p3 = fetch("/api/orders");
// Wait for all — fails if any one fails
const [users, products, orders] = await Promise.all([p1, p2, p3]);
// Wait for all regardless of failure
const results = await Promise.allSettled([p1, p2, p3]);
results.forEach(result => {
if (result.status === "fulfilled") console.log(result.value);
else console.error(result.reason);
});
// Timeout pattern with Promise.race
const timeout = new Promise((_, reject) =>
setTimeout(() => reject(new Error("Timeout!")), 5000)
);
const data = await Promise.race([fetch("/api/data"), timeout]);
How does async/await work? (Intermediate)

async/await is syntactic sugar over Promises. An async function always returns a Promise. The await keyword pauses execution of the async function until the awaited Promise settles, without blocking the main thread.
// Promise chain — harder to read
function getUserOrders(userId) {
return fetchUser(userId)
.then(user => fetchOrders(user.id))
.then(orders => processOrders(orders))
.catch(error => handleError(error));
}
// async/await — same logic, much cleaner
async function getUserOrders(userId) {
try {
const user = await fetchUser(userId);
const orders = await fetchOrders(user.id);
return await processOrders(orders);
} catch (error) {
handleError(error);
}
}
// Running async calls in parallel (don't await each one sequentially!)
async function loadDashboardSlow(userId) {
  // ❌ Sequential — slow (waits for each call before starting the next)
  const user = await fetchUser(userId);
  const posts = await fetchPosts(userId);
  const stats = await fetchStats(userId);
  return { user, posts, stats };
}
async function loadDashboardFast(userId) {
  // ✅ Parallel — fast (all three start at the same time)
  const [user, posts, stats] = await Promise.all([
    fetchUser(userId),
    fetchPosts(userId),
    fetchStats(userId)
  ]);
  return { user, posts, stats };
}
What is callback hell and how do you avoid it? (Beginner)

Callback hell (also called the "pyramid of doom") occurs when callbacks are nested several levels deep, making code hard to read, maintain, and debug.
// Callback hell — deeply nested, hard to follow
getUser(userId, function(user) {
getOrders(user.id, function(orders) {
getProducts(orders[0].id, function(products) {
getReviews(products[0].id, function(reviews) {
// ... more nesting
console.log(reviews); // buried 4 levels deep
});
});
});
});
// Solution 1: Promises
getUser(userId)
.then(user => getOrders(user.id))
.then(orders => getProducts(orders[0].id))
.then(products => getReviews(products[0].id))
.then(reviews => console.log(reviews))
.catch(console.error);
// Solution 2: async/await (cleanest)
async function loadData(userId) {
const user = await getUser(userId);
const orders = await getOrders(user.id);
const products = await getProducts(orders[0].id);
const reviews = await getReviews(products[0].id);
console.log(reviews);
}
What are generator functions? (Advanced)

Generator functions (declared with function*) can pause and resume their execution using the yield keyword. They return a Generator object that implements the Iterator protocol.
function* numberGenerator() {
console.log("Start");
yield 1;
console.log("After 1");
yield 2;
console.log("After 2");
yield 3;
}
const gen = numberGenerator();
console.log(gen.next()); // "Start" → { value: 1, done: false }
console.log(gen.next()); // "After 1" → { value: 2, done: false }
console.log(gen.next()); // "After 2" → { value: 3, done: false }
console.log(gen.next()); // → { value: undefined, done: true }
// Practical use: infinite sequence without memory issues
function* infiniteIds() {
let id = 1;
while (true) {
yield id++;
}
}
const idGen = infiniteIds();
console.log(idGen.next().value); // 1
console.log(idGen.next().value); // 2
// ... generates on demand, no array in memory
Section 3 – Node.js Core Concepts
What is Node.js and how does it differ from browser JavaScript? (Beginner)

Node.js is a server-side JavaScript runtime built on Chrome's V8 engine. It executes JavaScript outside the browser, letting you build web servers, CLI tools, APIs, and more.
| Feature | Browser JavaScript | Node.js |
|---|---|---|
| Environment | Browser | Server / system |
| DOM access | Yes (window, document) | No |
| File system access | No | Yes (fs module) |
| Network access | Limited (CORS restricted) | Full (HTTP, TCP, UDP) |
| Module system | ES Modules (native) | CommonJS + ES Modules |
| Global object | window | global / globalThis |
What is the difference between CommonJS and ES Modules? (Intermediate)

| Feature | CommonJS (CJS) | ES Modules (ESM) |
|---|---|---|
| Syntax | require() / module.exports | import / export |
| Loading | Synchronous | Asynchronous |
| Tree-shaking | Not supported | Supported |
| File extension | .js or .cjs | .mjs or .js with "type":"module" |
| Default in Node.js | Yes (legacy default) | Opt-in |
// CommonJS
const express = require("express");
const { readFile } = require("fs");
module.exports = { myFunction };
// ES Modules
import express from "express";
import { readFile } from "fs/promises";
export { myFunction };
export default myClass;
What are Streams in Node.js? (Intermediate)

Streams are objects that let you read data from a source or write data to a destination continuously — chunk by chunk — rather than loading everything into memory at once. They are critical for handling large files, HTTP requests, and real-time data efficiently.
There are four types of streams: Readable, Writable, Duplex (both), and Transform (duplex that modifies data).
const fs = require("fs");
const zlib = require("zlib");
// Without streams — loads entire file into memory (bad for large files)
const data = fs.readFileSync("huge-file.csv"); // 2GB in RAM!
// With streams — processes chunk by chunk (memory efficient)
const readStream = fs.createReadStream("huge-file.csv");
const writeStream = fs.createWriteStream("output.csv.gz");
const gzip = zlib.createGzip();
// Pipe: read → compress → write
readStream
.pipe(gzip)
.pipe(writeStream)
.on("finish", () => console.log("Done! Large file compressed efficiently"));
// HTTP response as a stream
const http = require("http");
http.createServer((req, res) => {
const fileStream = fs.createReadStream("large-video.mp4");
fileStream.pipe(res); // stream file directly to HTTP response
}).listen(3000);
What is the cluster module? (Advanced)

Node.js runs JavaScript on a single thread, so it cannot natively use multiple CPU cores. The cluster module lets you spawn multiple worker processes (typically one per CPU core) that all share the same server port, enabling true parallelism.
const cluster = require("cluster");
const http = require("http");
const os = require("os");
if (cluster.isPrimary) {
const numCPUs = os.cpus().length;
console.log(`Primary ${process.pid} running — spawning ${numCPUs} workers`);
for (let i = 0; i < numCPUs; i++) {
cluster.fork();
}
cluster.on("exit", (worker) => {
console.log(`Worker ${worker.process.pid} died — restarting...`);
cluster.fork(); // auto-restart dead workers
});
} else {
// Worker process — each runs an HTTP server on the same port
http.createServer((req, res) => {
res.writeHead(200);
res.end(`Response from worker ${process.pid}`);
}).listen(3000);
console.log(`Worker ${process.pid} started`);
}
💡 Tip: In production, use a process manager like PM2 instead: pm2 start app.js -i max — no manual cluster code needed.

What is the difference between process.nextTick() and setImmediate() in Node.js? (Advanced)

| | process.nextTick() | setImmediate() |
|---|---|---|
| Queue | nextTick queue (highest priority) | Check phase of the event loop |
| Runs when | After the current operation, before I/O | After I/O callbacks |
| Priority | Higher than Promises | Lower than nextTick and Promises |
console.log("1 - sync");
setImmediate(() => console.log("2 - setImmediate"));
process.nextTick(() => console.log("3 - nextTick"));
Promise.resolve().then(() => console.log("4 - Promise"));
console.log("5 - sync");
// Output:
// 1 - sync
// 5 - sync
// 3 - nextTick ← nextTick queue (before microtasks in Node.js)
// 4 - Promise ← microtask queue
// 2 - setImmediate ← check phase
Section 4 – Express.js & REST APIs
What is Express.js? (Beginner)

Express.js is a minimal, unopinionated web framework for Node.js. It provides a thin layer of fundamental web application features without obscuring Node's own APIs.
Core features: routing, middleware support, template engine integration, static file serving, error handling, and HTTP utility methods.
const express = require("express");
const app = express();
// Parse JSON request bodies
app.use(express.json());
// Route handlers
app.get("/api/products", async (req, res) => {
const products = await Product.findAll();
res.json(products);
});
app.post("/api/products", async (req, res) => {
const { name, price } = req.body;
const product = await Product.create({ name, price });
res.status(201).json(product);
});
app.put("/api/products/:id", async (req, res) => {
const product = await Product.findByIdAndUpdate(req.params.id, req.body);
if (!product) return res.status(404).json({ message: "Not found" });
res.json(product);
});
app.delete("/api/products/:id", async (req, res) => {
await Product.findByIdAndDelete(req.params.id);
res.status(204).send();
});
app.listen(3000, () => console.log("Server running on port 3000"));
What is middleware in Express? (Intermediate)

Middleware in Express is a function with access to the request object (req), the response object (res), and the next function. Middleware can execute code, modify req and res, end the request-response cycle, or pass control to the next middleware.
// Custom authentication middleware
function authenticate(req, res, next) {
const token = req.headers.authorization?.split(" ")[1];
if (!token) {
return res.status(401).json({ message: "No token provided" });
}
try {
const decoded = jwt.verify(token, process.env.JWT_SECRET);
req.user = decoded; // attach user to request
next(); // pass control to next middleware/route
} catch (error) {
res.status(401).json({ message: "Invalid token" });
}
}
// Request logger middleware
function requestLogger(req, res, next) {
console.log(`${new Date().toISOString()} ${req.method} ${req.path}`);
next();
}
// Apply globally
app.use(requestLogger);
// Apply to specific routes
app.get("/api/profile", authenticate, (req, res) => {
res.json(req.user);
});
How do you handle errors in Express? (Intermediate)

Express has a special error-handling middleware signature with four parameters (err, req, res, next). It must be registered after all other routes and middleware.
// Custom error class
class AppError extends Error {
constructor(message, statusCode) {
super(message);
this.statusCode = statusCode;
this.isOperational = true;
}
}
// Async error wrapper — avoids try/catch in every route
const asyncHandler = (fn) => (req, res, next) =>
Promise.resolve(fn(req, res, next)).catch(next);
// Route using asyncHandler
app.get("/api/users/:id", asyncHandler(async (req, res) => {
const user = await User.findById(req.params.id);
if (!user) throw new AppError("User not found", 404);
res.json(user);
}));
// Global error handler (must be last)
app.use((err, req, res, next) => {
const statusCode = err.statusCode || 500;
const message = err.isOperational ? err.message : "Internal server error";
console.error("Error:", err);
res.status(statusCode).json({
status: "error",
message,
...(process.env.NODE_ENV === "development" && { stack: err.stack })
});
});
How do you structure a scalable Express project? (Intermediate)

A well-structured Express project separates concerns and scales cleanly as the codebase grows:
project/
├── src/
│   ├── controllers/      # Handle HTTP requests/responses
│   │   └── userController.js
│   ├── services/         # Business logic (independent of HTTP)
│   │   └── userService.js
│   ├── repositories/     # Database access layer
│   │   └── userRepository.js
│   ├── middleware/       # Custom middleware
│   │   ├── auth.js
│   │   └── errorHandler.js
│   ├── routes/           # Route definitions
│   │   └── userRoutes.js
│   ├── models/           # Data models/schemas
│   │   └── User.js
│   ├── utils/            # Helpers and utilities
│   ├── config/           # Configuration files
│   └── app.js            # Express app setup
├── tests/
├── .env
└── server.js             # Entry point (starts the server)
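The flow through those layers can be sketched without any framework — the object and function names below mirror the tree and are purely illustrative (the repository is stubbed instead of calling a real database):

```javascript
// repositories/userRepository.js — data access only (stubbed here)
const userRepository = {
  findById: async (id) => ({ id, name: "Saiful" })
};

// services/userService.js — business logic; knows nothing about HTTP
const userService = {
  getProfile: async (id) => {
    const user = await userRepository.findById(id);
    return { ...user, displayName: user.name.toUpperCase() };
  }
};

// controllers/userController.js — translates HTTP into service calls
const userController = {
  getUser: async (req, res) => {
    const profile = await userService.getProfile(req.params.id);
    res.json(profile);
  }
};

// routes/userRoutes.js would then wire it all together, e.g.:
// router.get("/users/:id", userController.getUser);
```

Because the service layer never touches req or res, it can be unit-tested without spinning up a server.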
💡 Tip: Keep app.js for Express configuration and middleware, and use server.js only for starting the server. This makes testing easier — you can import app without actually starting a server.

Section 5 – Advanced & Performance Topics
When should you use SQL vs NoSQL databases? (Intermediate)

| Feature | SQL (PostgreSQL, MySQL) | NoSQL (MongoDB, Redis) |
|---|---|---|
| Schema | Fixed, predefined | Flexible, dynamic |
| Data model | Tables and rows | Documents, key-value, graphs |
| Relationships | Foreign keys, JOINs | Embedded documents or references |
| Scaling | Vertical (primarily) | Horizontal |
| Best for | Complex queries, transactions | Unstructured data, high write volume |
Popular Node.js ORMs/ODMs: Prisma or Sequelize for SQL, Mongoose for MongoDB.
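To make the data-model row of the table concrete, here is the same "user with orders" data shaped both ways — plain objects only, no database library, and the field names are invented for illustration:

```javascript
// SQL-style: normalized — separate "tables" linked by a foreign key
const users = [{ id: 1, name: "Saiful" }];
const orders = [
  { id: 101, userId: 1, total: 500 },
  { id: 102, userId: 1, total: 80 }
];
// Assembling the full picture needs a JOIN-like lookup:
const userWithOrders = {
  ...users[0],
  orders: orders.filter(o => o.userId === users[0].id)
};

// NoSQL-style: denormalized — orders embedded directly in the user document,
// so one read returns everything (at the cost of duplication on updates)
const userDoc = {
  _id: 1,
  name: "Saiful",
  orders: [{ total: 500 }, { total: 80 }]
};

console.log(userWithOrders.orders.length); // 2
console.log(userDoc.orders.length); // 2
```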
How do you secure a Node.js API? (Advanced)

API security in Node.js spans multiple layers:
const helmet = require("helmet");
const rateLimit = require("express-rate-limit");
const mongoSanitize = require("express-mongo-sanitize");
const xss = require("xss-clean");
// 1. Security headers
app.use(helmet());
// 2. Rate limiting — prevent brute force
const limiter = rateLimit({
windowMs: 15 * 60 * 1000, // 15 minutes
max: 100,
message: "Too many requests"
});
app.use("/api", limiter);
// 3. Data sanitization — prevent NoSQL injection
app.use(mongoSanitize());
// 4. XSS protection
app.use(xss());
// 5. CORS configuration
app.use(cors({
origin: ["https://yourdomain.com"],
methods: ["GET", "POST", "PUT", "DELETE"],
credentials: true
}));
// 6. Validate and sanitize all input
// 7. Use HTTPS only
// 8. Store secrets in environment variables, never hardcode
// 9. Hash passwords with bcrypt (never store plain text)
// 10. Use parameterized queries to prevent SQL injection
How do you implement caching in Node.js? (Advanced)

Caching stores frequently accessed data in fast storage to reduce database load and improve response times. Redis is the de facto caching layer for Node.js applications.
const Redis = require("ioredis");
const client = new Redis(process.env.REDIS_URL);
// Cache middleware
function cacheMiddleware(ttlSeconds = 60) {
return async (req, res, next) => {
const key = `cache:${req.originalUrl}`;
const cached = await client.get(key);
if (cached) {
return res.json(JSON.parse(cached)); // serve from cache
}
// Override res.json to intercept and cache the response
const originalJson = res.json.bind(res);
res.json = async (data) => {
await client.setex(key, ttlSeconds, JSON.stringify(data));
return originalJson(data);
};
next();
};
}
// Apply to routes
app.get("/api/products", cacheMiddleware(300), async (req, res) => {
const products = await Product.findAll(); // only hits DB on cache miss
res.json(products);
});
What is the difference between HTTP and WebSocket? (Intermediate)

| Feature | HTTP | WebSocket |
|---|---|---|
| Connection | Short-lived, request-response | Persistent, full-duplex |
| Direction | Client initiates every time | Both client and server can send anytime |
| Overhead | Headers on every request | Small framing headers after handshake |
| Use case | Standard web APIs, forms | Chat, live feeds, gaming, collaboration |
const { WebSocketServer } = require("ws");
const wss = new WebSocketServer({ port: 8080 });
const clients = new Set();
wss.on("connection", (ws) => {
clients.add(ws);
console.log("Client connected");
ws.on("message", (data) => {
const message = JSON.parse(data);
// Broadcast to all connected clients
clients.forEach(client => {
if (client.readyState === ws.OPEN) {
client.send(JSON.stringify({
user: message.user,
text: message.text,
time: new Date().toISOString()
}));
}
});
});
ws.on("close", () => clients.delete(ws));
});
How do you optimize Node.js application performance? (Advanced)

- Use async/await properly — never block the event loop with synchronous operations (readFileSync, JSON.parse on huge strings)
- Use streams for large file operations and HTTP responses
- Implement caching with Redis for expensive database queries
- Use connection pooling — don't create new DB connections per request
- Enable gzip/Brotli compression with the compression middleware
- Use clustering or worker threads for CPU-bound tasks
- Implement pagination — never return unbounded datasets
- Profile with Node.js built-in profiler or clinic.js to find bottlenecks
// Pagination example
app.get("/api/products", async (req, res) => {
const page = parseInt(req.query.page) || 1;
const limit = Math.min(parseInt(req.query.limit) || 20, 100);
const offset = (page - 1) * limit;
const { count, rows } = await Product.findAndCountAll({
limit,
offset,
order: [["createdAt", "DESC"]]
});
res.json({
data: rows,
pagination: {
total: count,
page,
limit,
totalPages: Math.ceil(count / limit)
}
});
});
💼 Interview Tips for JavaScript & Node.js Roles
- Be ready to predict output. Interviewers frequently show you a code snippet and ask what it logs. Practice tracing async code, closures, and hoisting manually.
- Know the event loop cold. Draw the call stack, microtask queue, and callback queue on a whiteboard if asked. Explaining why Promises resolve before setTimeout shows deep understanding.
- Understand when NOT to use Node.js. It's not ideal for CPU-heavy computations — mention Worker Threads as the solution. This shows maturity.
- Know at least one Node.js framework deeply — Express is minimum, bonus for NestJS or Fastify knowledge.
- Security questions are common in senior roles — always mention helmet, rate limiting, input validation, and parameterized queries.
- Demonstrate async best practices — parallel Promise.all vs sequential await, proper error handling, avoiding unhandled rejections.
❓ Frequently Asked Questions
Is Node.js good for CPU-intensive tasks?

No — Node.js is optimized for I/O-bound tasks (file reads, database queries, HTTP requests). For CPU-intensive work (image processing, machine learning, complex computations), the single-threaded event loop will be blocked. Use Worker Threads, child processes, or offload the work to a dedicated service written in a language better suited to CPU work, such as Python or Go.
What is Deno and how does it compare to Node.js?

Deno is a modern JavaScript/TypeScript runtime created by Ryan Dahl, the original creator of Node.js, to address Node's design mistakes. Key differences: Deno has built-in TypeScript support, uses ES Modules only, has a permission-based security model, and imports modules by URL instead of through npm. Node.js has a vastly larger ecosystem and is far more widely used in production.
Should I learn Express or NestJS first?

Learn Express first to understand the fundamentals — middleware, routing, manual structure. Then learn NestJS if you are targeting enterprise roles or working in TypeScript-first teams. NestJS provides an opinionated architecture (similar to ASP.NET Core, with DI, decorators, and modules) that scales better in large teams.
What is the difference between require() and import in Node.js?

require() is CommonJS — synchronous, and works everywhere in Node.js by default. import is ES Module syntax — asynchronous, and requires either a .mjs extension or "type": "module" in package.json. ES Modules support tree-shaking and are the modern standard, but CommonJS still dominates the existing Node.js ecosystem.
How do you handle unhandled Promise rejections?

Always attach a .catch() handler to every Promise, or use try/catch inside async functions. Globally, you can listen for the unhandledRejection process event to log the error and shut down gracefully: process.on('unhandledRejection', (reason) => { console.error(reason); process.exit(1); });
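That global safety net fits in a few lines (the error message is arbitrary, and a real handler would also flush logs and close connections before exiting):

```javascript
// Register a last-resort handler for Promises that reject with no .catch()
process.on("unhandledRejection", (reason) => {
  console.error("Unhandled rejection:", reason.message);
  // In production: flush logs, close connections, then process.exit(1)
});

Promise.reject(new Error("boom")); // no .catch() — the handler above fires
```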
✅ Key Takeaways
- Always use const by default, let when needed, and never var in modern JavaScript
- Closures remember variables from their outer scope — they are the foundation of many JavaScript patterns
- The Event Loop is what makes Node.js non-blocking — the microtask queue (Promises) always runs before the callback queue (setTimeout)
- Use Promise.all for parallel success-or-fail, Promise.allSettled when you need all results regardless of failure
- Use streams for large data processing — never load a 1 GB file into memory when you can pipe it
- Structure Express apps in layers: routes → controllers → services → repositories
- Security fundamentals: helmet, rate limiting, input validation, bcrypt for passwords, parameterized queries
- Node.js is ideal for I/O-bound work — use Worker Threads for CPU-intensive tasks
Found this helpful? Share it with someone preparing for a JavaScript interview. Have a question we didn't cover? Drop it in the comments — we read and respond to every one.