Issue: machine boots into recovery/emergency mode after installation.
With the nomodeset parameter added in GRUB I was able to install Ubuntu, but after installing the NVIDIA 440 driver the machine keeps booting into recovery mode.
Note: Ubuntu 20.04 LTS worked on the same machine without any issues.
Env:
- Dell Alienware Aurora R9
- Ubuntu 18.04.4 LTS
- Kernel 5.3.0-28-generic
- dosfstools 4.1-1
- UEFI, dual boot, Secure Boot disabled, SATA mode set to AHCI
- BIOS version 1.0.7
$ lspci | grep VGA
01:00.0 VGA compatible controller: NVIDIA Corporation TU104 [GeForce RTX 2080 SUPER] (rev a1)
02:00.0 VGA compatible controller: NVIDIA Corporation TU104 [GeForce RTX 2080 SUPER] (rev a1)
Below is the journal log from a failed boot into emergency mode. For some reason, after the initramfs fails to unpack, the system is unable to mount the EFI partition and drops into emergency mode, reporting a bad superblock (among other possible causes).
This might be a bug in 18.04, given that 20.04 worked on the same hardware without any issues.
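For reference, before the log: a quick way to cross-check which partition /etc/fstab expects for /boot/efi against what the disks actually report (the device names below are from this machine):
$ grep /boot/efi /etc/fstab
$ sudo blkid -t TYPE=vfat
$ lsblk -f /dev/nvme0n1 /dev/sda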
Jun 07 17:08:32 test-Alienware-Aurora-R9 kernel: Initramfs unpacking failed: Decoding failed
Jun 07 17:08:32 test-Alienware-Aurora-R9 kernel: Freeing initrd memory: 48284K
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Found device PM981a NVMe SAMSUNG 2048GB ESP.
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Starting File System Check on /dev/disk/by-uuid/CCF3-E7D6...
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Started File System Check Daemon to report status.
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Found device ST2000DM008-2FR102 4.
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Activating swap /dev/disk/by-uuid/5cfe5c2d-6dd7-46b3-9fce-0e08b35b26cb...
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Activated swap /dev/disk/by-uuid/5cfe5c2d-6dd7-46b3-9fce-0e08b35b26cb.
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Reached target Swap.
Jun 07 17:08:33 test-Alienware-Aurora-R9 kernel: Adding 62499836k swap on /dev/sda4. Priority:-2 extents:1 across:62499836k FS
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd-fsck[623]: fsck.fat 4.1 (2017-01-24)
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd-fsck[623]: /dev/nvme0n1p1: 389 files, 36496/74752 clusters
Jun 07 17:08:33 test-Alienware-Aurora-R9 systemd[1]: Started File System Check on /dev/disk/by-uuid/CCF3-E7D6.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Mounting /boot/efi...
Jun 07 17:08:34 test-Alienware-Aurora-R9 mount[669]: mount: /boot/efi: wrong fs type, bad option, bad superblock on /dev/nvme0n1p1, missing codepage or helper program, or other error.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: boot-efi.mount: Mount process exited, code=exited status=32
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: boot-efi.mount: Failed with result 'exit-code'.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Failed to mount /boot/efi.
Jun 07 17:08:34 test-Alienware-Aurora-R9 kernel: FAT-fs (nvme0n1p1): IO charset iso8859-1 not found
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Dependency failed for Local File Systems.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Dependency failed for Clean up any mess left by 0dns-up.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: dns-clean.service: Job dns-clean.service/start failed with result 'dependency'.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: local-fs.target: Job local-fs.target/start failed with result 'dependency'.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: local-fs.target: Triggering OnFailure= dependencies.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Starting Set console font and keymap...
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Started Stop ureadahead data collection 45s after completed startup.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Reached target Timers.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Closed Syslog Socket.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Starting Set console scheme...
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Reached target Paths.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Reached target Sockets.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Started Emergency Shell.
Jun 07 17:08:34 test-Alienware-Aurora-R9 systemd[1]: Reached target Emergency Mode.
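From the emergency shell (you are already root there), a few checks that can help narrow this down; the device name is taken from the log above:
$ systemctl status boot-efi.mount          # shows the exact mount error again
$ journalctl -b -u boot-efi.mount          # full journal for the failing mount unit
$ mount -t vfat /dev/nvme0n1p1 /boot/efi   # try the mount by hand to test the partition itself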
The machine still fails to boot normally even after passing nomodeset in GRUB.
This is a freshly installed machine, and I tried several kernels, switching from newer to older; in every case the result was the same.
This machine has two drives, an NVMe SSD and a SATA HDD. Could the issue be caused by some dependency on the drive/hardware ordering? (See the checks after the output below.)
$ cat /proc/scsi/scsi
Attached devices:
Host: scsi0 Channel: 00 Id: 00 Lun: 00
Vendor: ATA Model: ST2000DM008-2FR1 Rev: 0001
Type: Direct-Access ANSI SCSI revision: 05
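To see how both drives are laid out and which disk's ESP the firmware boot entries actually point at, these two commands are enough (install the efibootmgr package if it is missing):
$ lsblk -o NAME,FSTYPE,SIZE,MOUNTPOINT,UUID
$ sudo efibootmgr -v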
Issue fixed.
I had selected the wrong disk for the boot loader during installation, which prevented the machine from booting properly: because of this, the EFI partition was being mounted from the secondary disk, causing the boot failure.
Selecting the correct disk for the boot loader during installation fixed the issue.
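For anyone hitting this on an already-installed system, here is a rough sketch of how I believe it could be repaired in place instead of reinstalling; the ESP path and bootloader id are the Ubuntu defaults, substitute the UUID that blkid reports for the ESP on the correct disk:
$ sudo blkid -t TYPE=vfat                  # find the ESP on the disk you actually boot from
$ sudoedit /etc/fstab                      # point the /boot/efi entry at that partition's UUID
$ sudo mount /boot/efi
$ sudo grub-install --target=x86_64-efi --efi-directory=/boot/efi --bootloader-id=ubuntu
$ sudo update-grub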
Notes: dual boot on the Aurora R9
For Ubuntu 18.04 / 16.04, the fix that worked for me:
- In the BIOS: disable Secure Boot and set the storage mode to AHCI.
- In GRUB, remove "quiet" and "splash" and add "nomodeset", then install the OS. Select the correct disk in the boot loader device list on the custom partitioning page.
- On first boot, again remove "quiet" and "splash" and add "nomodeset", then install the proprietary NVIDIA driver (preferably the latest, 440 at the time of writing). Reboot and it should work (a sketch of the commands is below).
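A sketch of those last steps from the command line; the driver package name is what was current for me, so check what ubuntu-drivers recommends on your machine:
# Keep nomodeset on every boot until the proprietary driver is in place
$ sudoedit /etc/default/grub               # set GRUB_CMDLINE_LINUX_DEFAULT="nomodeset" (drop quiet splash)
$ sudo update-grub
# Install the NVIDIA driver and reboot
$ ubuntu-drivers devices                   # lists the recommended driver for the GPU
$ sudo apt install nvidia-driver-440
$ sudo reboot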