How to display access log info on the console in Hapi Js.

To display the access log we need to use a few plugins from the Hapi Js ecosystem that are already available: good, good-console and good-squeeze.

If you have already installed Hapi Js, you can start with the plugins right away; otherwise, read my previous blog on how to install Hapi Js.

After installing hapi, install the three plugins listed above with the following commands.

 
npm install --save good 
npm install --save good-console 
npm install --save good-squeeze 

Once you run these commands, all three plugins will be installed and the dependencies will be saved in the package.json file.
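For reference, the dependencies section of package.json should end up looking roughly like this (the version numbers below are only illustrative; npm will save whatever versions it actually resolved):

```json
{
  "dependencies": {
    "good": "^7.0.0",
    "good-console": "^6.0.0",
    "good-squeeze": "^5.0.0"
  }
}
```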

Now update the server.js file as follows.

 
'use strict';

const Hapi = require('hapi');
const Good = require('good');

const server = new Hapi.Server();

server.connection({
    port: 3000,
    host: 'localhost'
});

server.route({
    method: 'GET',
    path: '/',
    handler: function (request, reply) {
        reply('Hello, world!');
    }
});

server.register({
    register: Good,
    options: {
        reporters: {
            console: [{
                module: 'good-squeeze',
                name: 'Squeeze',
                args: [{
                    response: '*',
                    log: '*'
                }]
            }, {
                module: 'good-console'
            }, 'stdout']
        }
    }
}, (err) => {
    if (err) {
        throw err; // something bad happened loading the plugin
    }

    server.start((err) => {
        if (err) {
            throw err;
        }

        server.log('info', 'Server running at: ' + server.info.uri);
    });
});
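The Squeeze filter in the options above decides which event types reach the console; `response: '*'` and `log: '*'` mean "all response events and all log events". As a hypothetical variant (assuming good v7-style options), you can also forward error events and periodic ops (server load) events by extending the args object and setting the plugin's ops interval; the interval value below is just an example:

```javascript
// Variant reporter config (assumed good v7-style options):
// also forward 'error' events and periodic 'ops' load-sampling events.
const options = {
    ops: { interval: 5000 },   // sample memory/load stats every 5s (example value)
    reporters: {
        console: [{
            module: 'good-squeeze',
            name: 'Squeeze',
            args: [{ response: '*', log: '*', error: '*', ops: '*' }]
        }, {
            module: 'good-console'
        }, 'stdout']
    }
};
```

You would pass this object as `options` in the same `server.register()` call shown above.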

Now when the server is started you'll see:

 140625/143008.751, [log,info], data: Server running at: http://localhost:3000 

And if you visit http://localhost:3000/ in the browser, you'll see

 
140625/143205.774, [response], http://localhost:3000: get / {} 200 (10ms) 

So this is how to display access log info on the console in Hapi Js.
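The same pipeline can write the access log to a file instead of the console. The sketch below is an assumption-laden variant, not part of the example above: it assumes good v7-style reporters, the good-file module (installed separately with `npm install --save good-file`), and an example log path; SafeJson is the transform exported by good-squeeze that serializes events to JSON lines:

```javascript
// Hypothetical file reporter: pipe response/log events through SafeJson
// into ./access.log as newline-delimited JSON (the path is an example).
const fileReporterOptions = {
    reporters: {
        file: [{
            module: 'good-squeeze',
            name: 'Squeeze',
            args: [{ response: '*', log: '*' }]
        }, {
            module: 'good-squeeze',
            name: 'SafeJson'        // serialize each event as a JSON line
        }, {
            module: 'good-file',
            args: ['./access.log']  // example destination file
        }]
    }
};
```

Register it with Good exactly as in the server.js above, replacing the `options` object.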
