I'm using Google Drive, mounted automatically after logging in with GNOME Online Accounts, but I've read that a mounted Google Drive doesn't keep an offline copy of the drive, so every access takes a long time. Is there a way to force Google Drive to be cached offline?
I have both Ubuntu 16.04 LTS and Ubuntu 18.04 LTS installed.
I want to search the documentation and read it offline (it is included in the installed packages in /usr/share/help and other locations). As far as I can understand, it comes in three formats:
- HTML: `.page` files (`dpkg -S .page | grep -E /usr/share/help | awk '{print $1}' | sort | uniq | wc -l` returns 12 results)
- DocBook: `.docbook` files (`dpkg -S .docbook | grep -E /usr/share/help | awk '{print $1}' | sort | uniq | wc -l` returns 21 results)
- ManPage (many, from `manpath` = /usr/local/man:/usr/local/share/man:/usr/share/man)

These files are rendered by the Yelp program (from the `yelp` package), which provides the executables `yelp` and `gnome-help`.
Notes:
- here on AskUbuntu there was a similar question about the ScrollKeeper database (provided by the `rarian-compat` package) without an answer; there is bug 726439, "Search for document returns an Unknown error: 'URI xref:search=' could not be parsed", on Launchpad.net (from 2011) about the broken(?) search in Yelp. If I press Ctrl+S to search for `searchterm` in the current version of Yelp, it still returns:

  Unknown Error
  The URI ‘xref:search=searchterm’ could not be parsed.
Of course I can run `grep` over all these files, but is it possible to run such a search from Yelp?
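Until Yelp's search is fixed, the `grep` workaround mentioned above can be wrapped in a small function. A sketch, where `search_help` is a made-up helper name (not part of any package) and the default location is the `/usr/share/help` directory from the question:

```shell
#!/bin/sh
# Sketch of the grep workaround: search Mallard (.page) and DocBook
# (.docbook) sources under a help directory and list the matching files.
search_help() {
    term="$1"
    dir="${2:-/usr/share/help}"
    # -r recurse, -I skip binaries, -l list file names only
    grep -rIl --include='*.page' --include='*.docbook' -- "$term" "$dir" 2>/dev/null
}

# Man pages are gzip-compressed, so they need zgrep instead, e.g.:
#   zgrep -l 'searchterm' /usr/share/man/man1/*.gz
```

Usage: `search_help searchterm` prints every help file containing the term, one path per line.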
Imagine that I have a fresh Ubuntu 18.04 install, but there are some default packages or services that connect to the Internet without my consent. How do I get rid of them completely, or take them offline in an effective way?
As far as I know, they are:
- `popularity-contest` - The Ubuntu Popularity Contest (popcon for short) gathers statistics determining which packages are most popular with Ubuntu users.
- `unattended-upgrades` - Automatically upgrades the computer with the latest security updates.
- `apt-daily` - Updates the apt index every day automatically.
- `snapd` - Updates snap packages automatically.
- `update-manager` - Checks for release updates.
I could be missing something. Please post if you know...
The purpose: I want full control over my Internet usage because, in my case, Internet access is limited. So I wish to update and upgrade manually rather than automatically.
Similar questions have been asked already but none of them have solved my problem:
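For reference, the items in the list above map onto a handful of systemd units plus one cron-driven package. A hedged sketch of turning them off, assuming the standard Ubuntu 18.04 unit names; it defaults to a dry run that only prints the commands, so nothing is changed until you set `DRY_RUN=0`:

```shell
#!/bin/sh
# Dry-run sketch: print (DRY_RUN=1, the default) or execute (DRY_RUN=0)
# the commands that stop the automatic network activity listed above.
DRY_RUN="${DRY_RUN:-1}"

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

# apt-daily and unattended-upgrades are driven by these systemd timers.
run sudo systemctl disable --now apt-daily.timer apt-daily-upgrade.timer

# snapd refreshes snaps on its own; the socket restarts the daemon on
# demand, so both the service and the socket have to be disabled.
run sudo systemctl disable --now snapd.service snapd.socket

# popularity-contest runs from cron; purging the package stops it.
run sudo apt-get purge popularity-contest
```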
I need to install a package on a standalone Linux box, specifically kdbg. I tried the command

`sudo apt-get install --download-only kdbg`

on a box connected to the internet, but it only downloads the package and the dependencies that I don't have installed. Some of those dependencies (that command downloaded 117 packages in total) have sub-dependencies, and those sub-dependencies have even more dependencies, and I'm going down a rabbit hole trying to fish those packages out of the repo.

I then tried a couple of other commands that supposedly download all dependencies, even the ones I have installed:

`apt-get download PACKAGE && apt-cache depends -i PACKAGE | awk '/Depends:/ {print $2}' | xargs apt-get download`

and

`apt-get download $(apt-rdepends <package> | grep -v "^ ")`

Command one only downloads the direct dependencies, like the ones you'd find on packages.ubuntu.com if you were to search for kdbg, and command two gives me the error

`Can't select candidate version for package <package> as it has no candidate`

for several different packages.
So, to restate my question: is there a way for me to download kdbg, all of its dependencies, all of those dependencies' dependencies, and so on? Or perhaps I am using one of the above commands incorrectly?
Thanks in advance.
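For what it's worth, the "no candidate" error above is typically caused by virtual package names in `apt-rdepends` output, which have no installable candidate of their own. A sketch that skips them, where `download_closure` is a made-up helper and the second argument exists only so the fetch command can be stubbed out for testing:

```shell
#!/bin/sh
# Sketch: download a package plus its full recursive dependency closure,
# skipping virtual packages (names that apt-cache cannot show).
download_closure() {
    pkg="$1"
    fetch="${2:-apt-get download}"   # replaceable with e.g. `echo` for a dry run
    # apt-rdepends indents "Depends:" annotation lines; keep only the
    # unindented package names.
    for dep in $(apt-rdepends "$pkg" | grep -v '^ '); do
        # Virtual packages have no candidate version; skip them.
        if apt-cache show "$dep" >/dev/null 2>&1; then
            $fetch "$dep"
        fi
    done
}

# Usage: download_closure kdbg
```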
One day I noticed that my Internet connection was being used. After running `sudo nethogs` I found that a package called `snapd` was regularly downloading something from the internet without my consent. The only snap program I have installed is VLC.

So, how can I see what `snapd` uploads or downloads (or has uploaded/downloaded), and how can I make `snapd` run in offline mode?