You are viewing a single comment's thread from:

RE: Hivemind is Live!

in #hivemind · 6 years ago

A full node with only 61GB of RAM is the dream! Congrats, team! Can you guys share more specs about your instances? How much disk, SSD or NVMe, which CPU, etc.?

 6 years ago (edited)

We run two types of Steem nodes - one that handles account history, and one that handles everything else except the tags and follows plugins (because Hivemind now handles those itself). Both types run on AWS instances with 61GB of RAM and a single physically attached NVMe drive. The shared memory files are stored only on the physically attached drive, no longer in RAM as was previously required. Hivemind itself uses a Postgres database in AWS RDS on instances with 32GB of RAM. Hivemind's 'app servers' use much smaller instances with only 4GB of RAM. All of this is part of our plan to streamline infrastructure for cost effectiveness, and there may be further improvements yet to be made.
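For readers wondering what this looks like from the API side, here is a minimal sketch of a follows-style call, the kind of request now answered from Hivemind's Postgres database rather than a steemd plugin. The endpoint, account name, and parameters are only examples.

```python
import requests

# Example public endpoint; substitute your own jussi/Hivemind deployment.
ENDPOINT = "https://api.steemit.com"

# condenser_api.get_followers(account, start, follow_type, limit):
# a follows-style query that Hivemind serves from Postgres instead of
# a steemd node's in-memory state.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "condenser_api.get_followers",
    "params": ["steemitblog", "", "blog", 10],
}

response = requests.post(ENDPOINT, json=payload, timeout=10)
print(response.json().get("result"))
```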

So, how many moving parts does a Hivemind API instance have?

  • app server
  • postgres
  • api node

right?

 6 years ago (edited)

Yes, Hivemind requires a Postgres database. You also need a steemd node (or nodes) running all plugins except tags/follows. We use jussi for routing, which is a custom reverse proxy and caching layer.

Everything has official Docker images available and could be set up to run together with docker-compose.
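To make the division of labour concrete, here is a rough Python sketch of the routing idea jussi implements: different API namespaces are sent to different upstream services. The upstream URLs, namespace prefixes, and helper functions are illustrative assumptions, not jussi's actual configuration format.

```python
import requests

# Illustrative upstream map: account history goes to one steemd node type,
# follows/tags-style calls go to a Hivemind app server, and everything else
# goes to a regular steemd node. Hostnames and prefixes are made up.
UPSTREAMS = {
    "account_history_api": "http://account-history-node:8090",
    "follow_api": "http://hivemind-app-server:8080",
    "tags_api": "http://hivemind-app-server:8080",
    "default": "http://steemd-node:8090",
}

def route(method: str) -> str:
    """Pick an upstream URL for a JSON-RPC method by namespace prefix."""
    for prefix, url in UPSTREAMS.items():
        if prefix != "default" and method.startswith(prefix):
            return url
    return UPSTREAMS["default"]

def call(method: str, params: list):
    """Send a JSON-RPC 2.0 request to whichever upstream handles the method."""
    payload = {"jsonrpc": "2.0", "id": 1, "method": method, "params": params}
    return requests.post(route(method), json=payload, timeout=10).json()
```

The sketch only captures the routing split; the real jussi also adds per-method caching on top of it.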

Well, I’m not a developer, I just wanna figure out how much you save with the new set of nodes versus the old RPC node.

With the cost of instances with lots of RAM being very high and fast disk being relatively cheap, quite a lot.

How much does 1GB of RAM cost now?

It's not really a full node on only 61GB of RAM; each node is only 61GB, and there are multiple (3) that make up a full node.

Two types of Steem nodes, not three. See the reply to @howo below :)

Servers/nodes same thing :)

Just wanted to be clear it wasn’t a single 61GB server making up a full node.

So it's kinda like sharding?

Kind of, but not really. In sharding you have the same tech but split into chunks.

With this you are splitting into chunks but using different tech (steemd, Hivemind, RocksDB) in each section.

What is RocksDB? I saw mention of Postgres above...

Full nodes used to run with all data stored in RAM. RocksDB is a disk database that was added to recent versions of Steem so that account history can be stored on disk, cutting memory requirements in half.
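For anyone unfamiliar with it, RocksDB is an embedded key-value store that keeps its data on disk. The tiny sketch below uses the python-rocksdb bindings purely as a generic illustration; the key naming has nothing to do with steemd's actual internal schema.

```python
import rocksdb  # pip install python-rocksdb

# Open (or create) a disk-backed key-value store.
db = rocksdb.DB("example.db", rocksdb.Options(create_if_missing=True))

# Writes and reads go to files on disk rather than RAM-resident shared memory.
db.put(b"account_history:alice:0", b'{"op": "transfer", "amount": "1.000 STEEM"}')
print(db.get(b"account_history:alice:0"))
```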

Hivemind uses a Postgres database.