• Announcements

    • MannDude

      Current state of vpsBoard   02/04/2017

      Dear vpsBoard members and guests:

      Over the last year or two, vpsBoard activity and traffic have dwindled. I have had a change of career and interests, and as such am no longer an active member of the web hosting industry.

      Due to time constraints and new interests, I no longer wish to continue maintaining vpsBoard. The website will remain online as an archive to preserve and showcase some of the great material, guides, and industry news generated by members, some of whom I remain in contact with to this day and now regard as personal friends.

      I want to thank all of our members who helped make vpsBoard the fastest-growing industry forum. In its prime it was an active, rich source of news, guides, and general off-topic banter and fun.

      I wish all members and guests the very best, whether it be with your business or your personal projects.

      -MannDude

Search the Community

Showing results for tags 'bash'.

Forums

  • vpsBoard
    • Announcements & Contests
    • Industry News
  • Virtual Private Servers
    • General Talk
    • Operating a VPS Business
    • Tutorials and Guides
    • Questions and Answers
  • The Lounge
    • The Pub (Off topic discussion)
    • Coding, Scripting & Programming
    • SBC, ARM, Home Automation
  • Marketplace & Reviews
    • Reviews
    • VPS Offers
    • Other Offers
    • Service Requests

Found 6 results

  1. For example, say you want to create an archive of a directory to back it up:

         tar -zcf /some/backup/folder/yoursite_www_$(date +"%m_%d-%Y_%H:%M").tar.gz /home/someuser/somesite/public_html

     That'll create an archive of everything located in /home/someuser/somesite/public_html. Great. But what if you have a folder with large files in it that you want to exclude from being backed up, for example /home/someuser/somesite/public_html/bigfiles? How would you exclude that from being archived? (One approach is sketched below.)
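     One way to do it (a minimal sketch, assuming GNU tar, whose exclusion patterns are unanchored by default) is the --exclude option:

         # same backup as above, but skip the bigfiles directory
         tar -zcf /some/backup/folder/yoursite_www_$(date +"%m_%d-%Y_%H:%M").tar.gz \
             --exclude='public_html/bigfiles' \
             /home/someuser/somesite/public_html

     Repeat --exclude for each path or glob pattern you want to leave out.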
  2. I've been using the DO API to automate some things, like VPS reboots and snapshots, and adding DNS records to configure email with Zoho. You can find the scripts on my github. Feel free to fork away / come up with other examples or use cases that you've automated or need automation for. (A minimal sketch of such a call is below.)
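     For reference, the core of such a script is a single authenticated HTTP call. A minimal sketch, assuming the DigitalOcean API v2 and a token and droplet ID you supply yourself:

         # reboot a droplet via the DigitalOcean v2 API
         curl -X POST "https://api.digitalocean.com/v2/droplets/$DROPLET_ID/actions" \
              -H "Authorization: Bearer $DO_TOKEN" \
              -H "Content-Type: application/json" \
              -d '{"type":"reboot"}'

     Swapping the body for '{"type":"snapshot","name":"pre-upgrade"}' requests a snapshot instead.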
  3. Hey guys. I had a few hours to waste today and I wrote a simple control panel using the SolusVM client API. It's not very useful, but it was fun to write, anyway, even if it was in bash. Here's the code in case anyone is interested in trying it out: https://gist.github.com/qrwteyrutiyoup/6876e6687ac1ac8eb36c
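     For anyone curious what talking to that API from a shell script involves: each request is one HTTPS call against the SolusVM master. A rough sketch (the endpoint and action names follow SolusVM's documented client API; the host, key, and hash here are placeholders you would get from your own panel):

         # query a VM's status through the SolusVM client API
         curl -sk "https://master.example.com:5656/api/client/command.php" \
              -d "key=YOUR_API_KEY" -d "hash=YOUR_API_HASH" -d "action=status"

     Other documented actions include reboot, boot, and shutdown.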
  4. Message bus systems are an easy way to distribute tasks. RabbitMQ is a message broker, and there are command line tools to interact with queues, too. If you need help to install a RabbitMQ service, ...

     To install these tools, call:

         apt-get install amqp-tools

     The two main tools are:

         amqp-consume
         amqp-publish

     The first is a blocking call that waits for incoming messages. The second puts a message on a queue. So we have "workers" that consume a queue and "masters" that publish tasks.

     A worker would look like:

         amqp-consume -s 127.0.0.1:5672 -q "test" -e "amq.topic" --vhost "/" -r "worker1" --username=guest --password=guest -d ~/onmessage.sh

     The parameters are:

         Usage: amqp-consume [-dA?] [-s|--server=hostname:port] [--vhost=vhost]
             [--username=username] [--password=password] [-q|--queue=queue]
             [-e|--exchange=exchange] [-r|--routing-key=routing key] [-d|--declare]
             [-A|--no-ack] [-?|--help] [--usage] [OPTIONS]... <command> <args>

     So if a message is dropped in the queue "test" with the routing key "worker1", the command ~/onmessage.sh is called. Create that script:

         nano ~/onmessage.sh && chmod +x ~/onmessage.sh

     With content:

         read line
         echo "Message: $line"

     amqp-consume pipes the content of the message into the command, so we cannot work with positional parameters ($1, $2, ...); instead we use read to save the stream into a variable called "line".

     To publish a message to "worker1", the following command has to be called:

         amqp-publish -e "amq.topic" -r "worker1" -b "this is a test message"

     The parameters are:

         Usage: amqp-publish [OPTIONS]...
           -e, --exchange=exchange           the exchange to publish to
           -r, --routing-key=routing key     the routing key to publish with
           -p, --persistent                  use the persistent delivery mode
           -C, --content-type=content type   the content-type for the message
           -E, --content-encoding=content encoding
                                             the content-encoding for the message
           -b, --body=body                   specify the message body

         Connection options
           -s, --server=hostname:port        the AMQP server to connect to
               --vhost=vhost                 the vhost to use when connecting
               --username=username           the username to login with
               --password=password           the password to login with

     I use this pattern for a download service: I send URLs to workers that wget the target. You can even put more than one worker on a queue, so the workload is distributed. Or define a second queue where the workers publish results, to implement asynchronous method calls. I like the amqp-tools because they can be used with any tool or language. (A sketch of such a download worker follows.)
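     As a sketch of that download-service pattern (the queue and exchange names reuse the example above; the "results" routing key and target directory are made-up illustrations):

         #!/bin/sh
         # onmessage.sh for a download worker: the message body is a URL.
         # Fetch it, then report the outcome on a second queue.
         read url
         if wget -q -P /tmp/downloads "$url"; then
             amqp-publish -e "amq.topic" -r "results" -b "done: $url"
         else
             amqp-publish -e "amq.topic" -r "results" -b "failed: $url"
         fi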
  5. Shundle is a general sh plugin manager I wrote when I realized how messy my ~/.bashrc was getting. It helps you manage your aliases, colors, and history, although it could be extended to cover anything (it's based on plugins). I created a few plugins around it to show the concept: colorize, aliazator, eternalize. Right now it adds 0m0.110s to the average bash startup time with all the plugins enabled, and 0m0.048s without any (I'm working on getting more shells supported). Note: I tested it on a dual core CPU. It's inspired by vundle [0] and oh-my-zsh [1], and it tries to be as unintrusive as possible. Plugins are enabled by placing a Bundle= directive in the shell profile file (~/.bashrc in bash), e.g. enabling aliazator:

         Bundle='chilicuil/shundle-plugins/aliazator.git'

     And later running:

         $ shundle install

     If it still sounds confusing, I've made a screencast at: http://showterm.io/260fe8f71ef23ccf3fd9e

     Feel free to grab the code and improve it or send suggestions: https://github.com/chilicuil/shundle

     [0] https://github.com/gmarik/Vundle.vim
     [1] https://github.com/robbyrussell/oh-my-zsh
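     (Those startup-cost figures are the kind you can measure yourself; a quick sketch using any bash:)

         # time how long an interactive bash takes to start and exit,
         # i.e. including whatever ~/.bashrc pulls in
         time bash -ic exit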
  6. To run:

         wget dl.getipaddr.net/speedtest.sh 2>/dev/null -O- | bash

     or

         curl -s dl.getipaddr.net/speedtest.sh -o- | bash

     I decided to create a script that tests the server's download speed as well as upload. To me, this has been something that's long overdue, and I'm surprised that no one has done it yet. Unlike speedtest-cli (the Python script), which interfaces with speedtest.net, this one uses far less RAM. There are a few options, like forcing 100MB file size tests instead; instructions on how to do that can be found here. Let me know if you guys have any issues. If it's security related, please send it to me via PM.
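     If piping a remote script straight into bash makes you uneasy, the cautious equivalent (same URL as above) is to download, inspect, and then run it:

         wget -q dl.getipaddr.net/speedtest.sh -O speedtest.sh
         less speedtest.sh    # review before executing
         bash speedtest.sh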