$ tar jxf ./libgphoto2-2.5.4.tar.bz2
$ cd libgphoto2-2.5.4
$ ./configure --prefix=/usr/local
$ make
$ sudo make install
# lsusb
# sh -c '/usr/local/lib/libgphoto2/print-camera-list udev-rules mode 0660 group plugdev > /etc/udev/rules.d/45-libgphoto2.rules'
# gedit /etc/udev/rules.d/45-libgphoto2.rules
Go to the bottom of the file. Just before the line that reads LABEL="libgphoto2_rules_end", add:
ATTRS{idVendor}=="VENDORID", ATTRS{idProduct}=="PRODUCTID", MODE="0660", GROUP="plugdev"
Save the rules file, reboot the computer, and turn on the camera. Everything should work. (You must use the Kodak cable for the EasyShare camera.)
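As a sketch of how the placeholders map to real values: lsusb prints a line of the form "Bus 001 Device 005: ID vvvv:pppp ...", where vvvv is the vendor ID and pppp the product ID. The IDs below are made-up examples; substitute whatever lsusb actually reports for the camera.

# Hypothetical rule; replace the two IDs with the values from lsusb:
ATTRS{idVendor}=="040a", ATTRS{idProduct}=="0576", MODE="0660", GROUP="plugdev"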
A GUI interface is available at Applications / Graphics / Gtkam digital camera browser.
2016-05-28
Plug the camera in using the special "Kodak" USB cord. The cord has the label "Kodak" and a ballast.
Kodak EasyShare C123
Turn the camera on, then use the file manager to transfer files.
# apt-get install xscreensaver
(GNOME has its own, much inferior screensaver.) Configure the "ripples" screensaver to pull images from the Pictures directory.
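A sketch of the non-GUI route, assuming the preference keys already exist in ~/.xscreensaver (xscreensaver-demo edits the same file from its Advanced tab):

# Point xscreensaver's image-loading hacks (ripples included) at the Pictures directory.
sed -i 's|^imageDirectory:.*|imageDirectory: /home/mbc/Pictures|' ~/.xscreensaver
sed -i 's|^chooseRandomImages:.*|chooseRandomImages: True|' ~/.xscreensaver
xscreensaver-command -restart    # pick up the new settings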
Previously I was working with Jekyll, the most popular static site generator. Since I have an interest in Node.js (and not in Ruby), I wanted a generator that would give me practice with Node. Hexo was the most popular according to staticgen. I am working with the default "Landscape" theme.
Running $ hexo new draft mypost in the lostnation directory
will create the post in the _drafts subdirectory.
The file will be named "mypost.md".
There will also be a subdirectory "mypost" that holds other content (figures, data, etc.).
When ready, $ hexo publish mypost will create a 2016-12-06-mypost directory and a 2016-12-06-mypost.md file.
$hexo generate will process all files into html
$ hexo server will start a local web server. The local site can be inspected at http://localhost:4000
$hexo deploy will upload to your web server. FTP parameters are in _config.yml
Hexo deploy
For deploy to work, update ~/.netrc with the site, username, and password (also useful for manual FTP diagnosis), then chmod 600 ~/.netrc.
hexo deploy fails silently, with no debugging info; FTP in manually to diagnose. Better yet, avoid it entirely and use rsync, which is faster and can perform incremental uploads, something hexo deploy can't do.
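A sketch of the rsync alternative, assuming the generated site lives in public/ under the lostnation directory; the host name and remote path are placeholders for the Hostgator account details:

# Regenerate the site, then push only the changed files to the web root.
hexo generate && \
rsync -avzP --delete --rsh='ssh -p2222' \
    ~/lostnation/public/ \
    mbcladwell@example.hostgator.com:/home2/mbcladwell/public_html/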
Get rid of the extra comments link by commenting out the relevant block in /home/mbc/lostnation/themes/landscape/layout/_partial/article.ejs.
Templates
Templates are found in the scaffolds directory. Modify them to include categories, comments: true, etc. Anything included in the template will end up in the draft (or post).
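As an example, a modified draft scaffold might look like the following; it is written as a heredoc only to keep everything in shell, and the extra fields are just the ones mentioned above:

# Overwrite the draft scaffold; Hexo substitutes {{ title }} and {{ date }} itself.
cat > scaffolds/draft.md <<'EOF'
---
title: {{ title }}
date: {{ date }}
categories:
comments: true
---
EOF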
I prefer to use my favicon across the entire site, so I've linked to the favicon in my theme config. I use Landscape, the default theme included with Hexo.
Sometimes, e.g. during the migration from Jekyll to Hexo, I need to bulk-process md files. Set up process, backup, source, and destination directories (a sketch of that setup follows the command list below). Here are some useful sed commands for bulk processing:
sed -i 's/{% highlight r %}/{% codeblock lang:r %}/g' *
sed -i 's/{% highlight text %}/{% codeblock lang:bash %}/g' *
sed -i 's/{% endhighlight %}/{% endcodeblock %}/g' *
sed -i 's/{% highlight sql linenos %}/{% codeblock lang:sql %}/g' *
sed -i 's/{% highlight lisp linenos %}/{% codeblock lang:lisp %}/g' *
sed -i 's/{% highlight r linenos %}/{% codeblock lang:r %}/g' *
sed -i 's/{% highlight html linenos %}/{% codeblock lang:html %}/g' *
sed -i 's/{% highlight elisp linenos %}/{% codeblock lang:elisp %}/g' *
sed -i 's/\/figs\//\/lnsDFoKytr\/figs\//g' *
sed -i 's/{{site.baseurl}}//g' *
sed -i 's/^permalink:.*//g' *
sed -n '/highlight/p' *
sed -n '/playstyle/p' *
grep -nwl "playstyle " **/*.md
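A minimal wrapper for the directory setup mentioned above; the directory names and locations are my assumptions, not anything Hexo or Jekyll prescribes.

#!/bin/bash
# Bulk-process markdown files in a working copy, keeping an untouched backup.
# All directory names below are placeholders.
SRC=~/lostnation/source/_posts
WORK=~/md-process
BACKUP=~/md-backup-$(date +%Y%m%d)

cp -a "$SRC" "$BACKUP"    # untouched copy in case a sed goes wrong
cp -a "$SRC" "$WORK"
cd "$WORK" || exit 1
# ...run the sed commands listed above here...
# When the output looks right, copy the results back:
# cp -a "$WORK"/. "$SRC"/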
Hexo has horrendous error messages, with no indication of which file caused the error. Note also that md files in _drafts will be processed, so any errors there need attention.
Sometimes a keyword will pop out of the error output. Search across files with:
grep -Ril "highlight" ./
-R: recursive; -i: ignore case; -l: show only the file name, not the matching line; ./: where to start looking
I want to use sudo-free Node.js with Hexo and blockchain-related scripts. Do not use the apt-get method of installation on Debian, as that provides a "nodejs" executable rather than "node", which can be problematic (at least with Hexo). I will follow the installation instructions at nearform. There is also a good tutorial at sitepoint.
# apt-get install curl          ;; if needed
$ curl https://raw.githubusercontent.com/creationix/nvm/v0.25.0/install.sh | bash
$ nvm install stable            ;; or choose a particular version
## restart the terminal
$ nvm alias default stable      ;; set stable as the default
;; to see where the executables reside:
npm config get prefix
Use ~/.npm for local (global) installation. Always install with the -g option (global, but with my setup it lands under my home directory, so no sudo is needed). The --save option adds the package to the application-specific package list (package.json), so it is reinstalled automatically wherever the application is moved.
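A sketch of that setup. With nvm, global installs already land under the home directory, so the explicit prefix may be redundant; hexo-cli is just an example of a global package and the --save example uses a placeholder name.

# Keep globally installed packages under the home directory (no sudo needed).
npm config set prefix "$HOME/.npm"
export PATH="$HOME/.npm/bin:$PATH"    # add to ~/.bashrc to make it permanent

# Global install of a command-line tool:
npm install -g hexo-cli

# Inside a project, --save records the dependency in package.json:
npm install --save some-package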
(package (name"guile-json") (version"4.3.2") (home-page"https://github.com/aconchillo/guile-json") (source (origin (method url-fetch) (uri (string-append"mirror://savannah/guile-json/guile-json-" version ".tar.gz")) (sha256 (base32 "0255c7f053z4p9mqzhpxwbfx3y47j9nfvlgnm8xasdclyzmjl9y2")))) (build-system gnu-build-system) (native-inputs `(("pkg-config" ,pkg-config) )) (inputs `(("guile" ,guile-3.0))) (synopsis"JSON module for Guile") (description "Guile-JSON supports parsing and building JSON documents according to the specification. These are the main features: @itemize @item Strictly complies to @uref{http://json.org, specification}. @item Build JSON documents programmatically via macros. @item Unicode support for strings. @item Allows JSON pretty printing. @end itemize\n")
;; Version 1.2.0 switched to GPLv3+ (from LGPLv3+). (license gpl3+))
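Assuming the definition above is saved in a file that evaluates to the package object (with the required Guix module imports), it can be built or installed directly; the file name below is an example:

guix build -f guile-json.scm    # build from the local definition
guix install guile-json         # or install the copy already in the Guix distribution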
To see all the files a package installed onto your system: dpkg-query -L <package_name>
To see the files a .deb file will install: dpkg -c <package_name.deb>
To see the files contained in a package that is NOT installed, use apt-file. Install and initialize it once if you haven't already:
sudo apt-get install apt-file
sudo apt-file update
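Quick examples of the three queries, using curl as a stand-in package name:

dpkg-query -L curl      # files an installed package put on the system
dpkg -c ./curl_*.deb    # files a downloaded .deb would install (example path)
apt-file list curl      # files in a package that is not installed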
;;---- install from sources -----
;; First get the source package:
apt-get source foo
;; and change to the source tree:
cd foo-*
;; Then install needed build-dependencies (if any):
sudo apt-get build-dep foo
;; Then create a dedicated version of your own build (so that you won't get
;; confused later when Debian itself releases a new version):
dch -l local 'Blah blah blah'
;; And finally build your package:
debuild -us -uc
;; If everything worked out fine, you should now be able to install your
;; package by running:
sudo dpkg -i ../*.deb

;; Unpacking upstream tarballs:
$ tar zxf file.tar.gz
$ tar zxf file.tgz
$ tar jxf file.tar.bz2
$ tar jxf file.tbz2
$ tar xf file.tar.xz
;; Now change directory and build:
$ ls
$ cd path-to-software/
# ./configure
# make
# make install
dpkg-source -x foo_version-revision.dsc
will extract the package into a directory called foo-version.
If you want just to compile the package, you may cd into foo-version directory and issue the command
dpkg-buildpackage -rfakeroot -b
to build the package (note that this also requires the fakeroot package), and then install the resulting .deb with dpkg -i.
In the past I used Ubuntu One until it went out of business, then ownCloud, until it went out of business too. I have been looking for a method to perform bidirectional backup to a directory on my Hostgator shared server. Osync initially looked promising, but it SSHs into the server too many times, exceeding the 12 attempts per IP per 90 seconds allowed by Hostgator. I fell back to rsync and had to hack the "bidirectional" feature in an unsatisfactory way.
Rsync
See here for setting up SSH keys. The command to upload from a local directory ~/syncd/ to Hostgator, preserving directory structure:
cd ~/syncd/
rsync -avzP --delete --rsh='ssh -p2222' /home/mbc/syncd mbcladwell@123.123.123.123:/home2/mbcladwell/public_html/
-q  quiet (use with cron)
-v  verbose
-r  recursive
-R  relative
-t  times; must be used to transfer only modified files in future backups
-a  equivalent to -rlptgoD
-o  owner
-g  group
-p  permissions
-D  devices (transfer char and block device info)
-z  compress
-P  same as --partial --progress
--partial   retain partially transferred files
--progress  show progress (should also be used with -v)
Create two batch files, one for upload and one for download. I will use the upload syntax to register a cron job. The job will run every 15 minutes and back up to the cloud.
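A sketch of the two batch files and the cron entry. The file names and the exact download direction are my assumptions about what "upload" and "download" mean here; the host, port, and paths mirror the rsync command above.

# ~/rsync-upload.sh -- local to Hostgator (run by cron)
rsync -avzq --delete --rsh='ssh -p2222' /home/mbc/syncd mbcladwell@123.123.123.123:/home2/mbcladwell/public_html/

# ~/rsync-download.sh -- Hostgator to local (run manually on login)
rsync -avzq --rsh='ssh -p2222' mbcladwell@123.123.123.123:/home2/mbcladwell/public_html/syncd/ /home/mbc/syncd/

# crontab -e entry: run the upload every 15 minutes
*/15 * * * * /bin/bash /home/mbc/rsync-upload.sh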
If I use a second computer, I manually run rsync-download.sh upon login. When returning to the primary computer, I again manually run rsync-download.sh to capture any changes made on the auxiliary machine. As a reminder, I created a batch file that prompts me on login. Add a new autostart via the Applications / Settings / Settings manager / Session and startup / Application autostart dialog with the command: xfce4-terminal -e "bash /home/pl/check-for-download.sh" --hold
check-for-download.sh
#!/bin/bash
## xfce4-terminal -e "bash /home/pl/check-for-download.sh" --hold
## by default, input is assigned to the variable REPLY
echo "Run download script? [Y|n]:"
read

if [[ ("$REPLY" == "Y") || ( "$REPLY" == "" ) ]]; then
    echo "starting download!"
    ./rsync-download.sh
else
    echo "Download canceled."
fi
Backup to USB
I don't want private keys in the cloud, so I back up personal directories and files to a local USB drive.
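A sketch of the USB copy; the mount point and the set of directories are placeholders for whatever the drive is mounted as and whatever needs preserving.

# Mirror the synced directory and the SSH keys onto a locally mounted USB drive.
rsync -avP --delete /home/mbc/syncd /media/mbc/backup/
rsync -avP /home/mbc/.ssh /media/mbc/backup/    # private keys never leave the machine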
$ git clone https://github.com/deajan/osync
$ cd osync
# sh install.sh
[root@xps osync]# ./install.sh
Created directory [/etc/osync].
Copied osync.sh to [/usr/local/bin].
Copied osync-batch.sh to [/usr/local/bin].
Copied ssh_filter.sh to [/usr/local/bin].
Created osync-srv service in [/lib/systemd/system] and [/etc/systemd/user].
Can be activated with [systemctl start osync-srv@instance.conf] where instance.conf is the name of the config file in /etc/osync.
Can be enabled on boot with [systemctl enable osync-srv@instance.conf].
In userland, active with [systemctl --user start osync-srv@instance.conf].
osync installed. Use with /usr/local/bin/osync
In order to make install statistics, the script would like to connect to http://instcount.netpower.fr?program=osync&version=1.2-beta3&os=Linux%20unknown%20unknown%20GNU%2FLinux
No data except those in the url will be send. Allow [Y/n]
As mentioned above, osync repeatedly confirms SSH accessibility, exceeding the login limits allowed by my provider. Insert a sleep statement in the CheckConnectivityRemoteHost function to slow the script down. Hostgator allows 12 SSH logins per 90 seconds, with a fresh allocation of logins every 90 seconds.
osync.sh
function CheckConnectivityRemoteHost {
    sleep 10
    local retval