UPDATE.mojo

From The DXSpider Documentation Wiki
Latest revision as of 14:38, 16 February 2023

These are the notes for upgrading to the mojo branch. PLEASE NOTE THERE HAVE BEEN CHANGES FOR ALL MOJO BRANCH USERS. See APPENDIX(i) at the end of this document.

The BIG TICKET ITEM in this branch is that long lived commands such as sh/dx and commands that poll external internet resources now don't halt the flow of data through the node. I am also using a modern, event driven, web socket "manager" called Mojolicious which is considerably more efficient than what went before. Using Mojolicious also brings the tantalising possibility of grafting on a web frontend, as it were, to the "side" of a DXSpider node. Apart from anything else there will, almost certainly, need to be some internal data structure reorganisation before a decent web frontend could be constructed.

Upgrading is not for the faint of heart. There is no installation script (but there will be) so, for the time being, you need to do some manual editing. Also, while there is a backward path, it will involve moving various files from their new home (/spider/local_data), back to where they came from (/spider/data).

Prerequisites:

A supply of good, strong tea - preferably in pint mugs. A tin hat, stout boots, a rucksack with survival rations and a decent miners' lamp might also prove comforting. I enclose this link: http://www.noswearing.com/dictionary in case you run out of swear words.

An installed and known working git based installation. Mojo is not supported under CVS or installation from a tarball.

 perl 5.10.1, preferably 5.14.1 or greater

This basically means running Ubuntu 12.04 or later (or one of the other Linux distros of similar age or later). The install instructions are for Debian-based systems. IT WILL NOT WORK WITHOUT A "MODERN" PERL. Yes, you can use bleadperl if you know how to use it and can get the node to run under it as a daemon without resorting to the handy URL supplied above. Personally, I wouldn't bother. It's easier and quicker just to upgrade your Linux distro. Apart from anything else, things like ssh and ntpd are broken on ALL older systems and will allow the ungodly in more easily than something modern.

Install cpanminus:

sudo apt-get install cpanminus

or

wget -O - https://cpanmin.us | perl - --sudo App::cpanminus

or

sudo apt-get install curl
curl -L https://cpanmin.us | perl - --sudo App::cpanminus

You will need the following CPAN packages:

If you are on a Debian based system (Devuan, Ubuntu, Mint etc) that is reasonably new (I use Ubuntu 18.04 and Debian 10/11) then you can simply do:

sudo apt-get install libev-perl libmojolicious-perl libjson-perl libjson-xs-perl libdata-structure-util-perl libmath-round-perl libnet-cidr-lite-perl

or on Red Hat based systems you can install the very similarly (but not identically) named packages. I don't know the exact names, but using anything less than CentOS 7 is likely to cause a world of pain. Also, I doubt that EV and Mojolicious are packaged for CentOS at all.

If in doubt or it is taking too long to find the packages you should build from CPAN. Note: you may need to install the essential packages to build some of these. At the very least you will need to install 'make' (sudo apt-get install make) or just get everything you are likely to need with:

sudo apt-get install build-essential
sudo cpanm EV Mojolicious JSON JSON::XS Data::Structure::Util Math::Round Net::CIDR::Lite
# just in case it's missing (top, that is)
sudo apt-get install procps

Please make sure, if you insist on using operating system packages, that your Mojolicious is at least version 7.26. Mojo::IOLoop::ForkCall is NO LONGER IN USE! The current version at the time of writing is 8.36.
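
If you want to see what you actually have, this sketch prints the installed module version and compares it against the 7.26 minimum (the `perl -MMojolicious` one-liner is the standard way to read a module's `$VERSION`; the comparison uses GNU `sort -V`):

```shell
# Print the installed Mojolicious version (empty if the module is missing):
mojover=$(perl -MMojolicious -e 'print $Mojolicious::VERSION' 2>/dev/null)

# Version-compare against the 7.26 minimum:
if [ -n "$mojover" ] && [ "$(printf '%s\n' 7.26 "$mojover" | sort -V | head -n1)" = "7.26" ]; then
    echo "Mojolicious $mojover is new enough"
else
    echo "Mojolicious missing or older than 7.26 - install/upgrade it first"
fi
```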

Login as the sysop user.

Edit your /spider/local/DXVars.pm so that the bottom of the file is changed from something like:

-- old --

 # the port number of the cluster (just leave this, unless it REALLY matters to you)
 $clusterport = 27754;

 # your favorite way to say 'Yes'
 $yes = 'Yes';

 # your favorite way to say 'No'
 $no = 'No';

 # the interval between unsolicited prompts if not traffic
 $user_interval = 11*60;

 # data files live in
 $data = "$root/data";

 # system files live in
 $system = "$root/sys";

 # command files live in
 $cmd = "$root/cmd";

 # local command files live in (and overide $cmd)
 $localcmd = "$root/local_cmd";

 # where the user data lives
 $userfn = "$data/users";

 # the "message of the day" file
 $motd = "$data/motd";

 # are we debugging ?
 @debug = qw(chan state msg cron );

-- to this --

 # the port number of the cluster (just leave this, unless it REALLY matters to you)
 $clusterport = 27754;

 # your favorite way to say 'Yes'
 $yes = 'Yes';

 # your favorite way to say 'No'
 $no = 'No';

 # this is where the paths used to be which you have just removed

 # are we debugging ?
 @debug = qw(chan state msg cron );

There may be other stuff after this in DXVars.pm; that doesn't matter. The point is to remove all the path definitions in DXVars.pm. If this isn't clear to you then it would be better to ask on dxspider-support for help before attempting to go any further.

One of the things that will happen is that several files currently in /spider/data will be placed in /spider/local_data. These include the user, qsl and usdb data files, the band and prefix files, and various bad data files. I.e. everything that is modified from the base git distribution.

Now run the console program or telnet localhost and login as the sysop user.

export_users
bye

as the sysop user:

sudo service dxspider stop

or

sudo systemctl stop dxspider

having stopped the node:

mkdir /spider/local_data
git reset --hard
git pull --all
git checkout --track -b mojo origin/mojo
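
A quick way to confirm the checkout landed where you expect (`git rev-parse --abbrev-ref HEAD` works on any git version):

```shell
cd /spider
# Should print: mojo
git rev-parse --abbrev-ref HEAD
# Shows the commit you are now on:
git log -1 --oneline
```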

if you have not already done this:

sudo ln -s /spider/perl/console.pl /usr/local/bin/dx
sudo ln -s /spider/perl/*dbg /usr/local/bin

Now in another window run:

watchdbg

and finally:

sudo service dxspider start

or

sudo systemctl start dxspider


APPENDIX(i)

Before shutting down to do the update, do a

sh/ver

and take note of the current git revision number (the hex string after git: mojo/ and the [r]).

DXSpider v1.55 (build 249 git: master/2fc6c64f[r]) using perl v5.30.3 on Linux
Copyright (c) 1998-2023 Dirk Koopman G1TLH

Also do an

export_users

With this revision of the code, the users.v3 file will be replaced with users.v3j.

On restarting the node, the users.v3j file will be generated from the users.v3 file. The users.v3 file is not changed. The process of generation will take up to 30 seconds depending on the number of users in your file, the speed of your disk(s) and the CPU speed (probably in that order). On my machine it takes about 5 seconds; on an RPi, who knows? This is a reversible change. Simply check out the revision you noted down before (git checkout <revision>) and email me should anything go wrong.
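
As a concrete example, taking the revision string from the sh/ver output shown above (2fc6c64f; substitute the hex string you noted down yourself):

```shell
# Stop the node first, then return the code to the noted revision:
sudo systemctl stop dxspider
cd /spider
git checkout 2fc6c64f     # your own noted hex string goes here
sudo systemctl start dxspider
```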

Part of this process may clear out some old records or suggest that there might be errors. DO NOT BE ALARMED. This is completely normal.

This change not only should make the rebuilding of the users file (much) less likely, but tests suggest that access to the users file is about 2.5 times quicker. How much difference this makes in practice remains to be seen.

When you have done this, in another shell, run /spider/perl/create_dxsql.pl. This will convert the DXQSL system to dxqsl.v1j (for the sh/dxqsl <call> command). When this has finished, run load/dxqsl in a console (or restart the node, though that isn't necessary).

This has been done to remove Storable completely from active use in DXSpider. I have started to get more reports of user file corruption in the last year than I ever saw in the previous ten. One advantage of this change is that user file access is now 2.5 times faster, so things like export_users should not stop the node for anything like as long as the old version did.

On the subject of export_users: once you are happy with the stability of the new version, you can clean out all your user_asc.* files (I'd keep the user_asc that you just created for emergencies).

The modern equivalent of this file is now called user_json and can be used in exactly the same way as user_asc to restore the users.v3j file. See Restoring the user DB.