Share Data Between JavaScript Modules
Imports, Globals, Dispatched Events, and Inter-Process Communication. Learn different techniques to share data between modules in the Browser & those in Node.js, as well as the pros & cons for each.
In JavaScript, a module is just another JavaScript file that exports items which can be imported into other JS files. With modules, the data coming out is directly bound to the code importing and using it.
Here is a quick example:
// math.js (module)
function sum(a, b) {
  return a + b;
}
// export the sum function so it is available outside this file
export { sum };
// app.js (ES modules)
// import the sum function by providing the file path
import { sum } from './math.js';
// use the function from math.js
console.log('Sum is: ', sum(2, 3)); // 5
The two most common ways to import modules are:
- CommonJS Modules (Server only)
- ECMAScript (ES) Modules (Browser and Server)
Using Modules
A module lets you define a set of items that you can export outside:
// my-module.js (CommonJS)
// functions
function sum(a, b) {
  return a + b;
}
// variables
const language = 'JavaScript';
// objects
const appConfig = {
  port: 3000,
};
// classes
class User {
  static name = 'Mirza';
}
// exporting members one by one
exports.sum = sum;
exports.language = language;
exports.config = appConfig;
exports.User = User;
And imported elsewhere:
// app.js
// importing all exported members
const { sum, language, config, User } = require('./my-module');
console.log('User.name :>> ', User.name); // Mirza
console.log('config :>> ', config); // { port: 3000 }
console.log('language :>> ', language); // JavaScript
console.log('Sum is: ', sum(2, 3)); // 5
In Object-Oriented terms, a module is a singleton that persists data during the runtime of the application and you can have as many modules in your application as you desire.
If you want to learn more about the JavaScript module system, be sure to check out my blog below:
Just like the window object in the web browser, Node.js has top-level objects that are accessible in any file without needing to import them. In Node.js these are called global namespace objects.
Node.js comes with a list of Global objects out of the box:
- process
- fetch
- console
- Buffer
- Date
- require, etc.
See more at the official docs.
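For a quick taste, a script can lean on several of these without a single require (the printed values will vary by machine):

```javascript
// process, Buffer, console, and Date are all reachable from any file
console.log(process.version);           // e.g. v20.11.0
console.log(process.cwd());             // current working directory
console.log(Buffer.from('hi').length);  // 2 (bytes)
console.log(new Date().getFullYear());  // the current year
```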
Custom global variables
Node.js also lets you create custom globals, for example:
// random-file.js
global.blog = 'Share Data Between Node.js Modules';
To use this global variable elsewhere, you still need to import this file into your main application file.
However, you do not need to export this global variable.
// app.js
require('./random-file');
console.log('blog :>> ', global.blog);
// blog :>> Share Data Between Node.js Modules
This can be beneficial when you have a global application config that you can export and use elsewhere:
// app-config.js
global.appConfig = {
  port: 9000,
  databaseURL: 'DB-CONNECTION-STRING',
  secret: 'MY_SECRET'
};
// app.js
require('./app-config');
console.log(appConfig);
/*
{
  port: 9000,
  databaseURL: 'DB-CONNECTION-STRING',
  secret: 'MY_SECRET'
}
*/
Frontend JavaScript also supports a set of built-in globals:
- window
- document
- console
- fetch, etc.
Creating a global
The browser also allows us to create custom globals using the window object. For example, you set the name of your global and assign it a value:
// date.helper.js
window.currentDateRomanNumerals = 'IX-II-MMXXIV'; // 09 Feb 2024
Using the global value
Then in the other file, you can print this value using the Window object:
// import the date helper file for its side effect (setting the global)
import './date.helper.js';
console.log('window.currentDateRomanNumerals :>> ',
window.currentDateRomanNumerals);
// you should see this value printed in the browser console
Notice that we can use window.currentDateRomanNumerals without exporting it from the date helper or importing it manually.
Node.js uses the events module (event-driven architecture) to transfer data between modules based on the publisher-subscriber pattern. This pattern repeats across Node.js: streams, sockets, processes, and web servers all use it.
Two key methods:
- emit: used for sending data, e.g. .emit(eventName, data)
- on: used for listening for data, e.g. .on(eventName, (data) => {})
There are other methods that we’ll touch upon later.
Share data Parent to Child
The parent module can send messages to one or multiple child modules.
We start by creating a parent file that will be in charge of dispatching (emitting) messages. First, we initialize the Event Emitter:
// parent.js
const { EventEmitter } = require('events');
const greetingEmitter = new EventEmitter();
module.exports = { greetingEmitter };
Then we set a message to be sent:
// parent.js
const { EventEmitter } = require('events');
const greetingEmitter = new EventEmitter();
module.exports = { greetingEmitter };
function sendGreeting() {
  // pass the event name "greetings" and the message "Hello World!"
  greetingEmitter.emit('greetings', 'Hello World!');
  console.log('Parent sent message!');
}

setTimeout(() => {
  sendGreeting();
}, 1000);
I introduced a delay before calling the sendGreeting() function because I want to send the message only after the child has subscribed to the event. (The name of the event is greetings.)
Child module
The child will receive this message by importing the greetingEmitter from the parent file. The child also listens for the event we emitted in the parent file (greetings).
// Connected to the parent module (child.js)
const { greetingEmitter } = require('./parent');
console.log('Before the event listen!');
// listening for the event from parent
greetingEmitter.on('greetings', message => {
  console.log(`Message from parent: ${message}`);
});
Main entry point
To be able to send and receive messages and see both printed in the console, we need to run both parent.js and child.js in the same process. Whenever you run a file with node, e.g. $ node app.js, you’re essentially starting a process for that JS file. With this in mind, create a main.js file that imports both files, then run main.js:
// entry point for running the events (main.js)
require('./parent');
require('./child');
$ node main.js
Before the event listen!
Message from parent: Hello World!
Parent sent message!
We’ve successfully connected the modules!
Share data Child to Parent
Likewise, we can send data from the child to the parent using the same emitter.
// child.js
const { greetingEmitter } = require('./parent');
console.log('Before the event listen!');
greetingEmitter.on('greetings', message => {
  console.log(`Message from parent: ${message}`);
  // send data back to the parent
  greetingEmitter.emit('response', { message: 'How do you do', timestamp: Date.now() });
  console.log('Child sent message!');
});
The parent receives the data
Inside the parent, we listen for the response event sent from the child:
const { EventEmitter } = require('events');
const greetingEmitter = new EventEmitter();
module.exports = { greetingEmitter };
function sendGreeting() {
  greetingEmitter.emit('greetings', 'Hello World!');
  console.log('Parent sent message!');
}

// data coming from the child
greetingEmitter.on('response', message => {
  console.log(`Message from 1st child: ${JSON.stringify(message, null, 2)}`);
});

setTimeout(() => {
  sendGreeting();
}, 1000);
Running files again
$ node main.js
Before the event listen!
Message from parent: Hello World!
Message from 1st child: {
  "message": "How do you do",
  "timestamp": 1706436908246
}
Child sent message!
Parent sent message!
It’s important to note that both greetings and response are custom events that you can name however you like.
Read more on Node.js Events:
Built into the web browser is the Custom Events API, which allows you to transfer data via events. You create an event that can be dispatched in one part of the application (file) and subscribed to in another.
HTML Setup
<body>
  <form class="color-form">
    <input type="text" placeholder="Enter color"/>
    <button type="submit">Change</button>
  </form>
  <script src="./index.js" type="module"></script>
</body>
JavaScript main file (index.js)
First, we set an on-submit event handler for the HTML form:
const colorForm = document.querySelector('.color-form');

colorForm.addEventListener('submit', (event) => {
  event.preventDefault();
  const inputValue = event.target[0].value;
  // ...
});
Then import the changePageBackground function from the child module and call it within the event listener:
import { changePageBackground } from './page-events.handler.js';

const colorForm = document.querySelector('.color-form');

colorForm.addEventListener('submit', (event) => {
  event.preventDefault();
  const inputValue = event.target[0].value;
  // trigger the custom event
  changePageBackground(inputValue);
});
JavaScript child file (page-events.handler.js)
Inside the changePageBackground function, create a new CustomEvent using the color typed into the input:
/**
 * @param {string} backgroundColor
 */
export const changePageBackground = (backgroundColor) => {
  const event = new CustomEvent('changeBackgroundEvt', {
    detail: {
      pageBackground: backgroundColor,
    },
  });
  // dispatch the event on the global window object
  window.dispatchEvent(event);
};
Let’s elaborate on what happened here:
- We created a custom event called changeBackgroundEvt.
- The event data is stored within the detail object (which is built into the CustomEvent class).
- We dispatch the event using the window global object.
Now this event lives on the window object and we need to subscribe to it. This can be done anywhere in the app (e.g. index.js):
import { changePageBackground } from './page-events.handler.js';

const colorForm = document.querySelector('.color-form');

colorForm.addEventListener('submit', (event) => {
  // ...
});

// subscribe to the color change event
window.addEventListener('changeBackgroundEvt', (event) => {
  const pageBody = document.querySelector('body');
  // extract the color from event.detail and change the page body style
  pageBody.style.background = event.detail.pageBackground;
});
Now the page background color should change every time we submit the form with a new color.
We could have also grabbed a reference to an HTML element (e.g. the page body), passed it around, and used it to dispatch and listen to events, but for the sake of simplicity, the window will do.
The final method is IPC, or Inter-Process Communication, through the Child Process module in Node.js.
When we used events, we created two separate files, parent.js and child.js, and wired them together via the main.js file so that they run in the same process. With child processes, each file runs in its own process and they communicate over the IPC protocol.
Share data Child to Parent
Using the fork() function we’ll link up two JavaScript files. The file calling fork() is the parent process, while the file being forked is the child process.
The parent file imports the child_process module and forks the child file:
// parent.js
const childProcess = require('child_process');
// register a child module
const child = childProcess.fork('./child.js'); // path to child file
Inside the child file, we’ll create a function that sends data to the parent process (module) using process.send():
// child.js
function sendToParent(message) {
  // send a message using the current process
  process.send(message);
}

sendToParent('Hello from Child component!');
Back in the parent file, we’ll listen for this message using the on() function and the built-in message event:
// parent.js
const childProcess = require('child_process');
const child = childProcess.fork('./child.js');
// the message event is built into Node.js
child.on('message', (message) => {
  console.log(message);
});
$ node parent.js
Hello from Child component!
Furthermore, we can be notified when the child process terminates by listening for the exit event.
// parent.js
const childProcess = require('child_process');
const child = childProcess.fork('./child.js');
child
  .on('message', (message) => {
    console.log(message);
  })
  .on('exit', () => {
    console.log('Child process has been terminated!');
  });
$ node parent.js
Hello from Child component!
Child process has been terminated!
Share data Parent to Child
There are two ways a parent process can pass data to the child:
- Passing arguments
- Passing data using IPC
Passing arguments from Parent to Child
Running JavaScript files through Node.js is as simple as doing:
$ node my-file.js
But we can also pass arguments:
$ node my-file.js Hello World
These arguments can be read inside the file we’re executing.
The data arrives in an array. We cut off the first two entries because they contain the path to the Node.js executable and the path to the script being run, which we do not need here.
// inside my-file.js
const dataReceived = process.argv.slice(2);
// ['Hello', 'World']
Following the same principle we can pass data from parent to child process. Inside parent file:
// parent.js
const childProcess = require('child_process');
const childData = ['Mirza','dev'];
const child = childProcess.fork('./child.js', childData);
Inside the child process:
// child.js
const dataReceived = process.argv.slice(2);
console.log('Data from parent :>> ', dataReceived);
Passing data using IPC
We’ll use the child process instance that we forked into the parent to send data from parent to child.
// parent.js
const childProcess = require('child_process');
// child instance
const child = childProcess.fork('./child.js');
// send data to the child after a second
setTimeout(() => {
  child.send({ name: 'Mirza', job: 'dev', lng: ['JS', 'C#'] });
}, 1000);
The child process will receive this message and handle it:
// child.js
// data received from the parent using IPC
process.on('message', (message) => {
  console.log(`Message from parent component: ${JSON.stringify(message, null, 2)}`);
});
$ node parent.js
Message from parent component: {
  "name": "Mirza",
  "job": "dev",
  "lng": [
    "JS",
    "C#"
  ]
}
The IPC approach also allows Duplex communication — sending data from parent to child and child to parent.
Closing the connection
We’ve just sent data from parent to child, but the IPC channel is still open, which keeps both processes alive. To terminate the connection we’ll call the kill() method on the forked child process in the parent file.
// parent.js
const childProcess = require('child_process');

const childData = ['Mirza', 'dev'];
const childP = childProcess.fork('./child.js', childData);

childP
  .on('message', (message) => {
    console.log(message);
  })
  .on('exit', () => {
    console.log('Child process has been terminated!');
  });

setTimeout(() => {
  childP.send({ name: 'Mirza', job: 'dev', lng: ['JS', 'C#'] });
}, 1000);

// terminate the child process after it's no longer needed
setTimeout(() => {
  childP.kill();
}, 2000);
Child file (no changes required).
// child.js
const dataReceived = process.argv.slice(2);

function sendToParent(message) {
  process.send(message);
}

console.log('Data from parent using arguments :>> ', dataReceived);
sendToParent('Hello from Child component!');

// data received from the parent using IPC
process.on('message', (message) => {
  console.log(`Message from parent component: ${JSON.stringify(message, null, 2)}`);
});
$ node parent.js
Data from parent using arguments :>> [ 'Mirza', 'dev' ]
Hello from Child component!
Message from parent component: {
  "name": "Mirza",
  "job": "dev",
  "lng": [
    "JS",
    "C#"
  ]
}
Child process has been terminated!
Finally, let’s look at the benefits and drawbacks of each approach.
Import - Export
✔️ Standard Practice
The most commonly used way to transfer data between modules is by exporting and importing items.
✔️ Data Certainty
Import/export allows direct access to the variables, functions, and objects defined in a module.
What a module exports is exactly what the importer receives.
✔️ Synchronous and Asynchronous communication
CommonJS modules load synchronously, while ES modules support both synchronous and asynchronous loading. Both module systems cache their imports.
✔️ Work with a variety of data structures
You can transfer classes, objects, arrays, Promises, primitive values, etc.
✔️ Dynamic Imports
ES Modules support asynchronous loading, allowing developers to load only the necessary parts of the code when needed, e.g. when the button is clicked or the route is visited.
❌ Tight Coupling
If you need to make a change in the exported functionality, you need to make changes in all places importing it.
❌ One-way communication
Module A can import module B, but if B also imports A you create a circular dependency, which can leave one of the modules only partially initialized.
❌ Lack of module standardization
On the server side, some packages support only CommonJS modules, others only ES modules, and some support both. So when upgrading your dependencies you need to consider what your imports will look like.
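To make the dynamic-import point above concrete: import() is called as a function and returns a promise for the module. The data: URL below is just a stand-in for a real module path such as './chart.js':

```javascript
// import() loads and evaluates the module only when this line runs,
// e.g. inside a click handler or a route callback
const lazy = import('data:text/javascript,export const answer=42;');

lazy.then(({ answer }) => {
  console.log('Loaded on demand:', answer); // Loaded on demand: 42
});
```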
Globals
✔️ Easy to read
Needs neither import nor export. Just assign a value to the global/window object and it’s there.
✔️ Global Settings
If your application has configuration settings that are required across various modules, global variables make them easily accessible.
❌ Dependency management
When a value can be used anywhere without restrictions, it becomes hard to track data changes and debug.
❌ Produce Memory Leaks
Globals live for the entire runtime and are never garbage-collected, and mutating the same global in multiple places can leave you with stale or mismatched data.
❌ Testing
Global objects make it more challenging to isolate and test components independently.
❌ Security
If sensitive information is stored globally in your web browser, it may be accessible by malicious code or unintended parts of the application.
Events
✔️ Non-blocking Execution
When one part of your program emits an event, it doesn’t wait for other parts to handle it immediately.
✔️ Loose Coupling
Events enable loose coupling between different parts of your application. The sender of an event doesn’t need to know anything about the receivers.
✔️ Custom ordering
The sender can notify multiple parts of the application in the order of execution you desire.
✔️ Duplex Communication
The parent can send data to the child and the child can send data to the parent as well.
❌ Uncertain Data Structure
The receivers do not know about the structure or type of data they will receive.
❌ Uncertain Source of Events
The receivers may not know where the events originate, leading to potential challenges with debugging and tracking events.
❌ Potential Memory Leaks
Unless their listeners are disposed of, receivers keep listening to the event (and holding references) indefinitely.
Inter-Process Communication
✔️ Gain Performance
Using child processes, you can offload heavy processing to a different process and then send the output back to the main process. This means less work (less blocking) on the main process when performing tasks.
✔️ Parallel Execution
Forking allows you to execute multiple instances of your Node.js application in parallel. This is useful on multi-core systems, where you can distribute tasks across different processes to take advantage of the available CPU cores.
✔️ Isolation
Each forked process operates in its own isolated environment (its own process). This also means that if one forked process crashes, it doesn’t affect the main application or other forked processes.
❌️ Resource overhead
Each forked process consumes additional system resources and memory. Creating too many processes may lead to increased resource consumption and potential performance issues.
❌ CPU limitations
Processes are heavyweight, so you’re limited in how many you should spawn. Creating more processes than there are CPU cores yields diminishing returns and might exhaust system resources.
❌ Debugging
Debugging forked processes can be more challenging than debugging a single-process application.
❌ Platform Dependencies
IPC might not work the same on all operating systems.
TLDR;
There isn’t a silver bullet for every scenario. When designing the architecture it’s important to choose the methods that best suit your project needs and align with the budget.
It’s also worth mentioning that there are Worker Threads in JavaScript, both in Node.js and the web browser (Web Workers) that are used to process CPU-bound tasks and send data back and forth using events.
That’s all from me today. If you found this article useful give it a clap.
Also, follow me on Medium, Twitter, and The Practical Dev to stay up to date with my content.
And I’ll see you in the next one 👋