amuck-landowner

GVH suspended after minor use of 100TB plan

mcmyhost

New Member
No, it's a photo of myself taken recently before I attended a family member's wedding. :)
Multiple Hosts?

 
Last edited by a moderator:

DomainBop

Dormant VPSB Pathogen
Hello everyone!

 

My name is Nicholas and I'm excited to be a part of this community!

 

I am currently employed in GreenValueHost's Executive Leadership Team as Vice President of Operations and I'll be more than happy to assist with any questions or concerns that anyone may have.

 

If you could please email [email protected] or email me directly at [email protected], I would love to get in touch with you and have this issue looked into as soon as possible.
Welcome, 17-year-old Vice President of Operations. I'm curious: did you decide to take the job as GVH VP because you're "bored and thought you could really make something of it. :)"

Will you be bringing your fondness for racist memes to GVH to entertain the customers with?
 

Francisco

Company Lube
Verified Provider
Welcome, 17-year-old Vice President of Operations. I'm curious: did you decide to take the job as GVH VP because you're "bored and thought you could really make something of it. :)"

Will you be bringing your fondness for racist memes to GVH to entertain the customers with?
Looks to be wiped clean.
 

Mun

Never Forget
I added an automated DD test at midnight so we can keep track of their "performance".

http://192.3.31.219/dd.txt <-- shows the most recent result.

http://192.3.31.219/dd_historical.txt <-- shows the history over time.

(It might be advisable for a few of you to pull this file occasionally and keep your own copies, in case my server randomly disappears.)


All tests are run with: 'dd if=/dev/zero of=test bs=64k count=16k conv=fdatasync; unlink test'
Mun
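For anyone who wants to follow Mun's suggestion and keep their own copies, here is a minimal sketch, assuming a POSIX shell with curl and cron available. The URLs are the ones posted above; the function name and archive directory are made up for this example.

```shell
# pull_dd_log URL DIR: fetch a benchmark log and store a timestamped copy,
# so the records survive even if the benchmark server disappears.
pull_dd_log() {
    url="$1"
    archive="$2"
    mkdir -p "$archive"
    out="$archive/dd_historical_$(date +%Y%m%d%H%M%S).txt"
    curl -fsS "$url" -o "$out" && printf '%s\n' "$out"
}

# Example (hypothetical archive path):
#   pull_dd_log http://192.3.31.219/dd_historical.txt "$HOME/gvh-dd-archive"
# Example cron entry (wrap the call in a script, run every 6 hours):
#   0 */6 * * * /usr/local/bin/pull_dd_log.sh
```

The timestamped filename means repeated pulls never overwrite each other, at the cost of storing duplicate data.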
 

GVH-Jon

Banned
I added an automated DD test at midnight so we can keep track of their "performance".

http://192.3.31.219/dd.txt <-- shows the most recent result.

http://192.3.31.219/dd_historical.txt <-- shows the history over time.

(It might be advisable for a few of you to pull this file occasionally and keep your own copies, in case my server randomly disappears.)


All tests are run with: 'dd if=/dev/zero of=test bs=64k count=16k conv=fdatasync; unlink test'
Mun
The performance of our ny1 node is going to fluctuate a bit; however, we have plans to deploy a few 6x Intel 512GB RAID-10 SSD nodes and transfer some accounts over from our existing nodes. We're just waiting for ColoCrossing to sort things out in Buffalo. :)
 
Last edited by a moderator:

GVH-Jon

Banned
Usual is 20.

If anyone wants me to re-run the script, please let me know.

Mun
That's odd actually, usually it's around 150 for most of our nodes. ny1 is really, really old though, and that's why we're replacing it with a pure SSD node as soon as CC clears up room.
 

Mun

Never Forget
That's odd actually, usually it's around 150 for most of our nodes. ny1 is really, really old though, and that's why we're replacing it with a pure SSD node as soon as CC clears up room.
Well, I'll gladly give you another $5 a month for the same plan and set up the same scripts on another node. Just PM me.

Mun
 

GVH-Jon

Banned
I'm actually facepalming at that meme myself; you guys aren't alone. Glad Nick removed it, though. -_-

Well, I'll gladly give you another $5 a month for the same plan and set up the same scripts on another node. Just PM me.

Mun
Would you accept Dallas, TX in CoreXChange? You can have as much IPv6 as you'd like.
 

Mun

Never Forget
I'm actually facepalming at that meme myself; you guys aren't alone. Glad Nick removed it, though. -_-

Would you accept Dallas, TX in CoreXChange? You can have as much IPv6 as you'd like.

Sure, I'll take another server. Should I put in a PM (I mean a ticket)?
 
Last edited by a moderator:

drmike

100% Tier-1 Gogent
I added an automated DD test at midnight so we can keep track of their "performance".

http://192.3.31.219/dd.txt <-- shows the most recent result.

http://192.3.31.219/dd_historical.txt <-- shows the history over time.
Yeah, paltry numbers there.

This is from some other BUF node on an account I've had access to.

dd if=/dev/zero of=test bs=64k count=16k conv=fdatasync; unlink test
16384+0 records in
16384+0 records out
1073741824 bytes (1.1 GB) copied, 6.45542 s, 166 MB/s

GVH would do well to move valued customers (some of them) off that NY1 node.
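As a sanity check on the numbers dd reports above: bs=64k with count=16k writes 64 KiB × 16384 blocks = 1 GiB, and dividing the byte count by the elapsed time reproduces the MB/s figure (dd uses decimal MB). A quick shell check, using only POSIX arithmetic and awk:

```shell
# bs=64k * count=16k = total bytes written by the benchmark
bytes=$((64 * 1024 * 16 * 1024))
echo "$bytes"   # prints 1073741824 (exactly 1 GiB)

# bytes / seconds / 1e6 reproduces dd's reported throughput
awk -v b="$bytes" -v s=6.45542 'BEGIN { printf "%.0f MB/s\n", b / s / 1e6 }'
# prints 166 MB/s
```

The same arithmetic applied to the paltry NY1 results gives a quick way to compare nodes on equal terms.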
 

sv01

Slow but sure
They aren't just going to let you abuse it, though, are they? If you're just wget'ing files constantly, that's not really using it fairly.
Why isn't that fair, since you bought a 100TB plan? I still don't get it.

For example, I run a backup script every 1-2 hours and consume about 500 GB. Is that not allowed? :)
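Back-of-the-envelope, using sv01's own numbers: a 500 GB transfer every 1-2 hours adds up to far more than 100 TB over a 30-day month, which is presumably why the host calls it abuse. A quick shell sketch (the figures are sv01's; the 30-day month is an assumption):

```shell
# Monthly transfer for a 500 GB job run at a fixed interval, 30-day month
per_run_gb=500
for interval_h in 1 2; do
    runs=$((24 / interval_h * 30))                       # runs per month
    echo "every ${interval_h}h: $((runs * per_run_gb)) GB/month"
done
# prints:
#   every 1h: 360000 GB/month
#   every 2h: 180000 GB/month
```

Either schedule lands at 180-360 TB/month, well past the advertised 100 TB.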
 