How to use the Redis Labs free tier with the Hydra-router Docker container? - node.js

I have been trying to implement microservices using the cool hydra package. It uses a Redis database for "coordinating" the services. I have set up a free tier on Redis Labs.
I have created two services using hydra-express and they run properly. However, I wish to use the hydra-router, since I want the services to be accessible from outside the system. According to this page, I need a local Redis setup. What can I do to use the Redis Labs setup itself and run the hydra-router? Also, the Redis database on Redis Labs is password protected.
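One way to point hydra-router at a hosted Redis is through its config file. Below is a sketch of a hydra-router `config.json`, assuming hydra's documented `hydra.redis` config shape; the host, port and password are placeholders for your Redis Labs endpoint, and some hydra versions accept the password embedded in a single `redis://` URL, so check the hydra-router docs for your version's exact keys:

```json
{
  "hydra": {
    "serviceName": "hydra-router",
    "serviceType": "router",
    "servicePort": 5353,
    "redis": {
      "url": "redis://:your-redislabs-password@redis-12345.c10.us-east-1-2.ec2.cloud.redislabs.com",
      "port": 12345,
      "db": 0
    }
  }
}
```

All hydra services that should coordinate through the router would need the same Redis connection settings in their own configs.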


PouchDB with CouchDB with Node Web Api Server Layer?

Just trying to spec out a new system. We want to build a mobile app with offline storage using react-native.
We were originally going to use Node and create a web API layer that worked with MongoDB. However, after looking into offline storage we found PouchDB. But PouchDB syncs with CouchDB, so we are considering using CouchDB and having a flat DB layer, with all our business and data logic modelled in the apps.
However there may be some server side things that we need to do (for example pulling other records from somewhere else in the organisation).
So can we deploy a web api using node alongside CouchDb? How would this work? Totally separate or in the same node process?
So can we deploy a web api using node alongside CouchDb?
Yes, of course.
How would this work?
I'm not sure which solution suits your needs, but one option is to develop a web app with this tech stack: Node.js/Express, CouchDB/PouchDB and ReactJS/MobX. One important point about CouchDB is that you shouldn't think about it like any other database. Basically, you don't need any server-side code to talk to CouchDB; it can all be done on the client side with PouchDB. The live sync of PouchDB is extremely convenient for handling offline/online storage. If you're considering ReactJS, take a look at MobX.

AWS EC2 migration to Azure VM using nodejs code

I need to migrate my AWS EC2 instance to an Azure VM using Node.js code. I checked the instructions on the Azure site.
So my simple question here is: can I migrate my EC2 instance to Azure using Node.js code only? Or can I replicate the steps from the above link in my Node.js application?
This is quite an open-ended question, and depends a great deal on how your application is currently set up, and what the goals of migrating to Azure are. With that said, here are some avenues for investigation that may help.
Infrastructure as a service
You mention "EC2 instance", so I'm assuming that it's a single VM, rather than a cluster. You can set up VMs in Azure and choose templates for the VMs e.g. here are preconfigured Node.js boxes that you could use as a starting point:
Once set up, deploy to the VM like you would normally deploy your application; making sure to install any dependencies that your application needs. Ideally you'd be automating and scripting this, rather than working directly on the box :-)
Platform as a service
Azure also has an interesting PaaS offering in the form of Azure App Service, and running Linux on App Service is now in preview. (See
There are two ways that you can use this:
If your application runs in a Docker container, then you can create a new app that pulls your container from a Docker registry (e.g. Docker Hub). Find out more here:
You can also set up a new app on the Node.js stack, and git push your code to the provided endpoint. Azure will then build and run your application. Find out more here:
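The container route above can be sketched with a Dockerfile like the following; the base image tag, port, and `server.js` entry point are assumptions about your app, so adjust them to match:

```dockerfile
FROM node:18-alpine
WORKDIR /app
# Install production dependencies first to take advantage of layer caching.
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

You would build this, push it to a registry such as Docker Hub, and point the App Service app at that image.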
Other considerations
If your application uses services that run on other VMs (e.g. a standalone database), consider moving it over to Azure in parts to simplify the work, and prove at each step that everything still works as expected. For example: move the application servers first. Then the database. Then any other services. This isn't advisable on production databases (you may hit up against any number of issues like running out of connections or creating unexpected locks); and you'll want to be careful of egress bandwidth costs both ways.

Need a solution for an Electron application that uses a shared database

I understand that node-mysql can be used for a database with Electron. However, if I build my app, the user will still need MySQL installed on their computer, correct? I need a database solution that multiple users of my app can use without having any other dependencies installed. Just my standalone app. Are there any solutions for this?
You can use PouchDB inside your Electron application and set up a remote CouchDB.
PouchDB can work offline inside your application and can synchronize with CouchDB. If you use sync, every time the remote database changes, all connected applications will pull the latest changes to their local database.
Sync will be in two directions (if you want this, otherwise you can use replicate), so when an application makes a database change inside their local PouchDB, it will synchronize this to the remote CouchDB, and all the other applications will also pull this change.
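The sync-versus-replicate distinction above can be sketched like this, assuming PouchDB's API (`db.sync`, `db.replicate.from`); the `localDb` and `remoteDb` arguments stand in for real PouchDB instances you would construct in your Electron app:

```javascript
// Hedged sketch: wire a local database to a remote CouchDB either
// two-way (sync) or pull-only (replicate.from), assuming PouchDB's API.
function connect(localDb, remoteDb, { twoWay = true } = {}) {
  const opts = { live: true, retry: true }; // keep listening for changes
  if (twoWay) {
    // changes flow in both directions between local and remote
    return localDb.sync(remoteDb, opts);
  }
  // one-way: only pull remote changes into the local database
  return localDb.replicate.from(remoteDb, opts);
}
```

With `live: true`, PouchDB keeps the connection open and applies remote changes as they happen, which is what gives every connected application the latest data.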
Well, the correct approach would be to connect your Electron app to a remote DB: set up an auth layer and put the DB behind it. You can also use the DB's own auth.
Or, if you can have an individual DB per user, you can use SQLite.

Digitalocean: cross droplet communication

I have a scenario with two Node.js apps deployed on two Dokku droplets. One of my apps is the admin app, which stores data to a MongoDB database. The other app is the main application, which reads data from that database.
How can I make this main app communicate with the database?
You need to link the database to the dokku container via environment variables. You basically need to follow this methodology:
The database needs to be accessible via an IP and port combination on one of your two servers. If you need both servers to communicate with the database then you will need to make sure it is externally accessible and properly secured (for example via a VPN).
You can then set an environment variable like so:
dokku config:set DB_URL='mongo://'
obviously changing the above to match your setup.
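On the app side, the main application can then read that variable and hand it to its MongoDB driver. A minimal sketch using only Node's built-in WHATWG `URL` class; the host, credentials and database name below are placeholders, not your real values:

```javascript
// Read the connection string dokku injected into the environment;
// fall back to a placeholder URL so this sketch is self-contained.
const dbUrl = process.env.DB_URL || 'mongodb://user:secret@10.0.0.2:27017/mydb';

// Node's URL class can pick the pieces apart for logging or validation.
const parsed = new URL(dbUrl);
console.log('connecting to host:', parsed.hostname);
console.log('database:', parsed.pathname.slice(1));

// With the official driver the actual connection would typically be:
//   const { MongoClient } = require('mongodb');
//   const client = await MongoClient.connect(dbUrl);
```

Because the URL comes from the environment, the same code runs unchanged on both droplets once each has `DB_URL` set via `dokku config:set`.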
Another potentially easier way of doing the above is to use a dokku plugin which will basically automate those steps.
A list of plugins is available at:
There is a mongo plugin which may suit your needs, I've used some of the others and they work well.

Can Meteor be used with PaaS services?

Am I correct in the assumption that without access to the MongoDB server, there is not much point developing with Meteor?
Meteor is a great framework for building, packaging and deploying apps and sites. From a development POV, the templating and reactive DB work make prototyping so much easier than most MVCs.
I understand that under the hood, websockets and DDP provide the realtime syncing magic, which means that you need access to the MongoDB server, something you don't have with PaaS solutions like Google App Engine, Parse or Kinvey.
So the backend developer doesn't derive much benefit from Meteor, since they still need to maintain the server stack and deal with scalability issues.
Is there a path to create and deploy products with Meteor without having to build and maintain the backend infrastructure? Heroku is still pretty close to the bone when it comes to managing infrastructure.
Wondering if there's a way to have CRUD operations through a REST driver that maps out to whatever PaaS you want and have the PaaS post log changes to a server that strictly handles websocket connections. Basically, pass the CRUD operation to a PaaS and maintain your own websocket server/s.
MeteorPedia has a page on deploying to PaaS:
Recently, Google AppEngine has added support for custom VMs.
You can also use MongoHQ or similar for the database.