Error Failed To Mount Boot Environment
Getting Rid of Pesky Live Upgrade Boot Environments
By User12611829-Oracle, May 21, 2009

As we discussed earlier, Live Upgrade can solve most of the problems associated with patching and upgrading your Solaris system. I'm not quite ready to post the next installment in the LU series, but based on some of the comments and email I have received, there are two problems that I would like to help you work around.

Oh where, oh where did that file system go?

One thing you can do to stop Live Upgrade in its tracks is to remove a file system that it thinks another boot environment needs. This falls into the category of user error, but you are more likely to run into this in
a ZFS world, where file systems can be created and destroyed with great ease. You will also run into a variant of this if you change your zone configurations without recreating your boot environment, but I'll save that for a later day.

Here is our simple test case:
1. Create a ZFS file system.
2. Create a new boot environment.
3. Delete the ZFS file system.
4. Watch Live Upgrade fail.

# zfs create arrakis/temp
# lucreate -n test
Checking GRUB menu...
System has findroot enabled GRUB
Analyzing system configuration.
Comparing source boot environment ...

Source: https://blogs.oracle.com/bobn/entry/getting_rid_of_pesky_live_upgrade

How to Clean Up Live Upgrade on Solaris

Again I am back to one of my favorite topics: Live Upgrade. As we all know, Live Upgrade arrived with Solaris 10 and is a great feature when used along with the ZFS file system. It makes OS patching much simpler and saves a lot of time. But Live Upgrade also has a lot of bugs and will not work in every setup and zone configuration (check out the various Live Upgrade issues covered on this site). Here we will see how to clean up the alternate boot environment and the current boot environment without impacting the system.

Source: http://www.unixarena.com/2013/12/how-to-cleanup-liveupgrade-on-solaris.html
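When a dataset disappears out from under Live Upgrade, as in the test case above, a stale reference lives on in Live Upgrade's bookkeeping files (/etc/lutab and /etc/lu/ICF.&lt;n&gt;). Below is a minimal sketch of filtering out the stale entry, run against a mock ICF file under /tmp so it can be tried safely anywhere. The colon-delimited line format and the dataset names are assumptions modeled on the example above; on a real system, back up /etc/lu before touching anything.

```shell
# Mock ICF file under /tmp; on a real system the files live at
# /etc/lu/ICF.<n> (back them up first).
# Assumed line format (colon-delimited): BE name, mount point, dataset, fstype, size.
ICF=/tmp/ICF.example
cat > "$ICF" <<'EOF'
test:/:arrakis/ROOT/test:zfs:0
test:/temp:arrakis/temp:zfs:0
EOF

# Drop the line that references the destroyed dataset (arrakis/temp).
grep -v ':arrakis/temp:' "$ICF" > "$ICF.new" && mv "$ICF.new" "$ICF"

cat "$ICF"   # only the surviving root entry remains
```

With the stale line gone, Live Upgrade no longer tries to mount the missing dataset when it assembles the boot environment.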
Once Live Upgrade is messed up, it is better to start from the beginning. ludelete will not always work to destroy the alternate BE, so here we will see some tricks to clean up the BEs. Note: in Solaris 11, Live Upgrade has been redesigned with new commands.

Here is the current configuration of my global zone:

bash-3.00# lustatus
Boot Environment           Is       Active Active    Can    Copy
Name                       Complete Now    On Reboot Delete Status
-------------------------- -------- ------ --------- ------ ----------
OLDBE                      yes      yes    yes       no     -
NEWBE                      yes      no     no        yes    -
bash-3.00# zoneadm list -cv
  ID NAME     STATUS     PATH               BRAND    IP
   0 global   running    /                  native   shared
   1 u1       running    /export/zones/u1   native   shared
bash-3.00# zfs list -t snapshot
NAME                          USED  AVAIL  REFER  MOUNTPOINT
rpool/ROOT/root@NEWBE        1.70M      -  3.50G  -
rpool/export/zones/u1@NEWBE  1.21M      -   484M  -
bash-3.00# zonename
global
bash-3.00#

1. How to activate the current boot environment

Source: https://forums.freenas.org/index.php?threads/updates-failing-on-freenas-9-3.26828/

I'm currently unable to update my FreeNAS install any longer, and an update is queued. I tried applying the updates via the web UI, but an error pops up in the web UI (it disappears too quickly for me to capture, but the same text gets logged). I get the following errors in the system log when I do so:

Code:
Jan 17 13:23:09 callisto updated.py: [freenasOS.Update:699] Unable to mount boot-environment FreeNAS-9.3-STABLE-201501162230
Jan 17 13:23:17 callisto updated.py: [freenasOS.Update:741] Update got exception during update: Unable to mount boot-environment FreeNAS-9.3-STABLE-201501162230
Jan 17 13:23:18 callisto manage.py: [middleware.exceptions:38] [MiddlewareError: Unable to mount boot-environment FreeNAS-9.3-STABLE-201501162230]

Anyone have any idea what is going on here?
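Before destroying anything by hand, it helps to confirm which boot environments lustatus actually marks as deletable. A small sketch, using sample rows modeled on the lustatus listing above, that prints the names of BEs whose "Can Delete" column is "yes":

```shell
# Sample rows modeled on the lustatus listing above.
# Columns: Name, Is Complete, Active Now, Active On Reboot, Can Delete, Copy Status
lustatus_sample='OLDBE yes yes yes no -
NEWBE yes no no yes -'

# Print the names of BEs whose "Can Delete" column (field 5) is "yes".
printf '%s\n' "$lustatus_sample" | awk '$5 == "yes" { print $1 }'
```

Here only NEWBE is printed: OLDBE is the active BE and can never be deleted, which is why any cleanup has to target the alternate BE.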
My System Specs
FreeNAS version: FreeNAS-9.3-STABLE-201501241715
Motherboard: ASRock C2750D4I Mini-ITX Server board
CPU: Intel Atom C2750 2.4 GHz 8 cores (Avoton)
RAM: Crucial CT2KIT102472BD160B 2x 8GB PC3-12800 1600MHz ECC RAM
Disks: 6x WD Red 3TB WD30EFRX-68E SATA (via LSI 9211-8i SAS controller, crossflashed to P20 IT)
Power supply: SilverStone Tek ST45SF-G 450W Fully Modular SFX 80+ Gold Certified
Case: SilverStone Tek DS380B (currently without SAS backplane)
Boot device: Kingston DataTraveler SE9 64GB USB Flash Drive
Hostname: callisto
demon, Jan 17, 2015

sef (iXsystems) replied: The update for 20150115
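The stuck boot environment in the log above can be identified mechanically, since its name is the last token of the error line. A minimal sketch of extracting it; the beadm cleanup mentioned in the comment is an assumption about the usual FreeBSD/FreeNAS recovery path, not a fix confirmed by this thread:

```shell
# The BE name is the last space-separated token of the logged error line.
logline='Jan 17 13:23:09 callisto updated.py: [freenasOS.Update:699] Unable to mount boot-environment FreeNAS-9.3-STABLE-201501162230'
be_name=${logline##* }
echo "$be_name"   # prints FreeNAS-9.3-STABLE-201501162230

# A likely cleanup on the FreeNAS box itself (not runnable here; using
# beadm for this is an assumption, not something the thread confirms):
#   beadm destroy -F "$be_name"
```

Removing the half-created BE lets the updater create a fresh one on the next attempt instead of trying to remount the broken one.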