kotlin – How to show a notification while an Android Backup is running?

If you have used WhatsApp, you'll know that whenever a chat backup is running, a notification appears letting the user know that a backup is in progress, along with its progress and other details.

For my app, I am using a custom BackupAgentHelper, which backs up the user's Room database file as well as the SharedPreferences:

import android.app.backup.BackupAgentHelper
import android.app.backup.FileBackupHelper
import android.app.backup.SharedPreferencesBackupHelper
import java.io.File

class CustomAgent : BackupAgentHelper() {

    val DB_NAME = "notesDB"

    val DB_BACKUP_KEY = "dbBackup"
    val SHARED_PREFS_KEY = "prefsBackup"

    override fun onCreate() {
        super.onCreate()
        // Back up the notes database file
        val dbHelper = FileBackupHelper(this, DB_NAME)
        addHelper(DB_BACKUP_KEY, dbHelper)

        // Back up the SharedPreferences
        val prefHelper = SharedPreferencesBackupHelper(this)
        addHelper(SHARED_PREFS_KEY, prefHelper)
    }


    // FileBackupHelper resolves its file names relative to filesDir, so
    // point it at the databases directory so the database file is found.
    override fun getFilesDir(): File {
        val path = getDatabasePath(DB_NAME)
        return path.parentFile
    }
}

In my case, I would like to show a notification while the backup is running and dismiss it automatically when it has finished. Also, is there a way to send another notification when the backup quota is exceeded? As far as I know, Google only allows backups of up to 25 MB.

Can I restore a GDrive WhatsApp backup if I skip it?

I want to reset my phone and use a brand-new WhatsApp install for a bit, then later uninstall and reinstall WhatsApp and restore the latest backup. Is this possible, or will skipping the restore cause the previous backups to be deleted?

backup – How to back up a folder every 15 minutes without taking up more space than needed?

I’m incrementally backing up my coding project with Duplicity and am finding it’s consuming storage faster than expected.

The folder I'm backing up consists mostly of GIFs, images, and libraries, which take up most of the space and change only occasionally, plus my script files, which take up very little space but need to be backed up every 15 minutes or so to avoid losing hours of progress if data is accidentally deleted.

Since Duplicity doesn't store the files whole but keeps encrypted slices that have to be reassembled on restore, I assumed it only copies the parts of the folder that have changed into each snapshot and pieces the rest together algorithmically. Is that correct?

If not, is there a tool that would be more efficient for doing this? Ideally I’m trying to set up a system that backs up the entire folder to a USB drive once per day, and does a more efficient sync every 15 minutes so I can easily go back if I accidentally delete everything.
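
To make the split concrete, this is roughly what I have in mind, sketched with Duplicity and placeholder paths (the project under ~/project and the USB drive mounted at /media/usb are assumptions, not my real layout):

# Once per day: a full backup of the whole project to the USB drive.
duplicity full "$HOME/project" file:///media/usb/project-daily

# Every 15 minutes (e.g. from cron): a run limited to the small,
# frequently changing script files, skipping the heavy media and libraries.
duplicity --include "$HOME/project/scripts" --exclude '**' \
    "$HOME/project" file:///media/usb/project-scripts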

arm – how to back up and minify a Raspberry Pi with Windows 10?



backup – MySQL Dump only views, triggers, events and procedures

Hi, how are you?

I need to generate a file with all the triggers, events, and procedures from a server's databases, and I tried the two commands below:

  • mysqldump -u root -p --all-databases --host=127.0.0.1 --no-data
    --no-create-db --no-create-info --routines --triggers --skip-comments
    --skip-opt --default-character-set=utf8 -P3306 > E:db_objects_no_create.sql

  • mysqldump -u root -p --all-databases --host=127.0.0.1 --no-data
    --no-create-db --routines --triggers --skip-comments --skip-opt
    --default-character-set=utf8 -P3306 > E:db_objects.sql

With the first command, the output file does not contain the creation code correctly; the statements come out commented.

With the second command, the creation code comes out correctly, but the table creation statements are also exported.

Does anyone know how to do this export without the table creation code being generated in the output file?
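
For reference, this is roughly the shape I am experimenting with; the flags come from the mysqldump manual and the output path is a placeholder, so treat it as a sketch rather than a confirmed answer:

# --no-create-info drops the CREATE TABLE statements, --no-data drops the
# row data, and --routines/--events/--triggers request the stored objects.
mysqldump -u root -p --host=127.0.0.1 -P3306 \
    --all-databases \
    --no-create-db --no-create-info --no-data \
    --routines --events --triggers \
    --default-character-set=utf8 > db_objects_only.sql

(I am also not sure whether the statements I see as "commented out" are just the /*!50003 ... */ blocks; as far as I understand, those are version-conditional comments that MySQL still executes on import.)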

Do you value your data? Get the fastest and most stable Acronis Backup Cloud now!

Acronis Backup Cloud (ABC) is the world's easiest and fastest backup solution for protecting your Linux systems.

Acronis Backup Cloud protects more than 20 platforms and incorporates the backup industry's most advanced anti-ransomware technology, safeguarding data and systems in any environment, physical or virtualized, on-premises or in the cloud. As a turnkey SaaS-based solution, it is as seamless to deploy as it is to manage and does not add complexity to any IT infrastructure, having almost no impact on system performance.

For more information: http://www.acronis.com/

Based on our tests, Acronis is 2x faster than other backup software. Compression is really good, and both backup and restoration are fast. Best of all, it will not increase the load on your server. Acronis can be integrated with WHM and cPanel.

Are you getting a better price from other providers? Show it to us and we will try to beat it.

********** Special Offers 4 Someone Special Like You ****************

Our offers:

100 GB of Backup Space
1 Agent/Server
Unmetered Bandwidth
WHM Interface
cPanel Plugin
Access To Acronis Backup Console
$14.95 per month | FREE Setup
Order Now

200 GB of Backup Space
1 Agent/Server
Unmetered Bandwidth
WHM Interface
cPanel Plugin
Access To Acronis Backup Console
$19.95 per month | FREE Setup
Order Now

300 GB of Backup Space
1 Agent/Server
Unmetered Bandwidth
WHM Interface
cPanel Plugin
Access To Acronis Backup Console
$29.95 per month | FREE Setup
Order Now

500 GB of Backup Space
3 Agents/Servers
Unmetered Bandwidth
WHM Interface
cPanel Plugin
Access To Acronis Backup Console
$49.95 per month | FREE Setup
Order Now

1000 GB of Backup Space
5 Agents/Servers
Unmetered Bandwidth
WHM Interface
cPanel Plugin
Access To Acronis Backup Console
$89.95 per month | FREE Setup
Order Now

—–

Visit https://jonesolutions.com/acronis-backup-cloud/ for more information and video demo.

—————————————————-

If you have any questions, please contact us via sales@jonesolutions.com

——————————————————————

Some Testimonials About Our Services:

https://www.webhostingtalk.com/showt…1#post10254881
https://www.webhostingtalk.com/showt…7#post10254767
https://www.webhostingtalk.com/showt…6#post10238426
https://www.webhostingtalk.com/showt…3#post10234413
https://www.webhostingtalk.com/showt…5#post10221175
https://www.webhostingtalk.com/showt…0#post10220890
https://www.webhostingtalk.com/showt…6#post10199546
https://www.webhostingtalk.com/showt…9#post10191859
https://www.webhostingtalk.com/showt…3#post10182883
https://www.webhostingtalk.com/showt…0#post10182800
http://www.webhostingtalk.com/showpo…6&postcount=12
http://www.webhostingtalk.com/showpo…7&postcount=13
http://www.webhostingtalk.com/showpo…05&postcount=1
http://www.webhostingtalk.com/showpo…07&postcount=3
http://www.webhostingtalk.com/showpo…49&postcount=5
http://www.webhostingtalk.com/showpo…53&postcount=6
http://www.webhostingtalk.com/showpo…5&postcount=16
http://www.webhostingtalk.com/showpo…79&postcount=4
http://www.webhostingtalk.com/showpo…42&postcount=8
http://www.webhostingtalk.com/showpo…0&postcount=35
http://xenforo.com/community/threads…9/#post-200451
http://www.webhostingtalk.com/showpo…75&postcount=1
http://www.webhostingtalk.com/showpo…0&postcount=15
http://www.webhostingtalk.com/showpo…4&postcount=22
http://www.webhostingtalk.com/showpo…4&postcount=11
http://www.webhostingtalk.com/showpo…4&postcount=11
http://www.webhostingtalk.com/showpo…6&postcount=15
http://www.webhostingtalk.com/showpo…63&postcount=7
http://www.webhostingtalk.com/showpo…74&postcount=6
http://www.webhostingtalk.com/showpo…97&postcount=1
http://www.webhostingtalk.com/showpo…60&postcount=1
http://www.webhostingtalk.com/showpo…4&postcount=15
http://www.vbulletin.org/forum/showp…7&postcount=12
http://www.webhostingtalk.com/showpost.php?p=4496870
http://www.webhostingtalk.com/showpost.php?p=4495732
http://www.webhostingtalk.com/showpo…82&postcount=4
http://www.webhostingtalk.com/showpo…71&postcount=7
http://www.webhostingtalk.com/showpo…31&postcount=3
http://www.webhostingtalk.com/showpo…7&postcount=21
http://www.webhostingtalk.com/showpo…8&postcount=10
http://webhostingtalk.com/showpost.p…04&postcount=2
http://www.webhostingtalk.com/showpo…45&postcount=2
http://www.webhostingtalk.com/showpo…66&postcount=3
http://www.webhostingtalk.com/showpo…66&postcount=4
http://webhostingtalk.com/showpost.p…29&postcount=1
http://webhostingtalk.com/showpost.p…91&postcount=5

——————————————————————

Why choose us?

1. We've been in the business since December 2001 and are still growing; we have proven ourselves for 18 years now.

2. We focus on our expertise, and we know we are good at it.

3. We have a top-notch support team guaranteed to provide solutions to your support concerns.

4. Your data is safe with us. We protect it with our own server management expertise.

5. All our servers are monitored 24/7 via www.monitormybox.com, which notifies us if a server goes down.

6. Your transactions are safe with us. When it comes to billing, we use very secure payment processors (e.g. 2Checkout and PayPal). We never keep your credit card details, and we respect your privacy.

7. There are other great reasons to join us, but best of all, as our valued client, we will support you all the way, because your business is important to us.

——————————————————————

Looking for Linux Reseller Hosting? Visit https://jonesolutions.com/premium-ssd-linux-reseller/
Looking for cPanel Fully Managed KVM VPS? Visit https://jonesolutions.com/fully-managed-kvm-vps/
Looking for cPanel Fully Managed Cloud KVM VPS? Visit https://jonesolutions.com/fully-managed-cloud-ssd-vps/
Looking for cPanel Fully Managed Dedicated Servers? Visit https://jonesolutions.com/fully-mana…cated-servers/
Looking for Proactive Server Management Service? Visit https://jonesolutions.com/proactive-server-management/
Looking for Cloudlinux License? Visit https://jonesolutions.com/cloudlinux-license/
Looking for KernelCare License? Visit https://jonesolutions.com/kernelcare-license/

Thanks,

net

backup – How to locate & compare script sources across a bunch of copies of the same G. Sheet? It's messy!

I'm getting used to coding in Sheets, using their script editor. I've coded a bunch of macros over time on a few projects, but there is one sheet in particular that I keep multiple backups of (the data gets updated and I like to have old copies to compare).

Along the way I sometimes get lost in versions and want to go back and find a version I wrote earlier, ideally using some kind of compare tool, or at least side by side in two windows (rather than just searching and viewing all copies one by one).

Questions:

  • Is there a smart and fast way to compare the code of various sheets (backups of the same sheet)?
  • Or perhaps a way to search through all my sheets' scripts across my Drive? I do keep local copies of my code, so could I search those for a given string? For example, I put version numbers in my scripts, say v=0.3.9, so I'd want to find the sheet whose script contains that version (see the sketch after this list).
  • Also: should I avoid their editor, or use something else? I'm an experienced coder; I use WSL (Windows Subsystem for Linux), Visual Studio Code, Atom, and Notepad++ to varying degrees (I was previously a Java coder for many years, so I'm re-learning these newer toolsets, which are better suited to JavaScript).
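
To illustrate the local-copy part of the question, here is the kind of search and comparison I have in mind; the folder layout is hypothetical (one exported script file per backup copy), not something Sheets produces by itself:

# Hypothetical layout: each backup copy's script exported as
# ~/sheets-backups/<copy-name>/Code.gs

# Find which backup copy contains a given script version string.
grep -rn --include='*.gs' 'v=0.3.9' ~/sheets-backups/

# Compare the script from two backup copies side by side.
diff -u ~/sheets-backups/copy-A/Code.gs ~/sheets-backups/copy-B/Code.gs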

bash – Shell script to backup local files to public cloud

This is a script that I’ve written for personal use, and to educate myself. The script is a bit comment heavy to help future me remember all the details & design choices made at the time of writing.

The use case is simply to synchronize/copy files to an AWS S3 bucket using the AWS CLI. There's a separate CDK part of the project, which sets up the AWS infrastructure, but that isn't really relevant here. Some configuration items are read from a properties file; the script then checks whether everything is in place on the AWS end and, if so, reads through a config folder structure describing which folders to back up and how (include and exclude patterns in the respective files).

Going with Bash instead of a basic shell script was a deliberate choice, since this wouldn’t be run on any production server, and extreme portability wasn’t the main point here.

Folder structure of the overall project is:

- aws-infra
-- (various things here, that are out of the scope of the question)
- config
-- backup
--- Documents
---- includes.txt
--- Pictures
---- includes.txt (example at the end)
---- excludes.txt (example at the end)
--- (more files/folders following the same structure)
-- configuration.properties
- scripts
-- sync.sh

Theoretically I could've just run aws s3 sync on the base path, but since it's a recursive command and there are a lot of unnecessary files (about 500k), it would take a lot of time to go through each of them separately.

#!/bin/bash

# Get the current directory where this file is, so that the script can be
# called from other directories without breaking.
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd)"

CONFIG_FOLDER="$DIR/../config"
PROP_FILE='configuration.properties'

# This is an associative array, i.e., keys can be strings or variables
# think Java HashMap or JavaScript Object
declare -A properties

# These are Bash arrays, i.e., with auto-numbered keys
# think Java or JavaScript array
declare -a includes excludes params

function loadProperties {
    local file="$CONFIG_FOLDER/$PROP_FILE"

    if [[ ! -f "$file" ]]; then
        echo "$PROP_FILE not found!"
        return 2
    fi

    while IFS='=' read -r origKey value; do
        local key="$origKey"
        # Replace all non-alphanumerical characters (except underscore)
        # with an underscore
        key="${key//(!a-zA-Z0-9_)/_}"

        if (( "$origKey" == "#"* )); then
            local ignoreComments
        elif (( -z "$key" )); then
            local emptyLine
        else
            properties("$key")="$value"
        fi
    done < "$file"

    if (( "${properties(debug)}" = true )); then
        declare -p properties
    fi
}

function getBucketName {
    # Declare inside a function automatically makes the variable a local
    # variable.
    declare -a params
    params+=(--name "${properties[bucket_parameter_name]}")
    params+=(--profile="${properties[aws_profile]}")

    # Get the bucket name from SSM Parameter Store, where it's stored.
    # Logic is:
    # 1) run the AWS CLI command
    # 2) grab 5th line from the output with sed
    # 3) grab the 2nd word of the line with awk
    # 4) substitute first all double quotes with empty string,
    #    and then all commas with empty string, using sed
    local bucketName=$(aws ssm get-parameter "${params[@]}" |
                       sed -n '5p' | 
                       awk '{ print $2 }' | 
                       sed -e 's/"//g' -e 's/,//g')

    properties[s3_bucket]="$bucketName"
}

function checkBucket {
    declare -a params
    params+=(--bucket "${properties[s3_bucket]}")
    params+=(--profile="${properties[aws_profile]}")

    # Direct stderr to stdout by using 2>&1
    local bucketStatus=$(aws s3api head-bucket "${params[@]}" 2>&1)
    
    # The 'aws s3api head-bucket' command returns an empty response if
    # everything is OK, or an error message if something went wrong.
    if [[ -z "$bucketStatus" ]]; then
        echo "Bucket ${properties[s3_bucket]} owned and exists";
        return 0
    elif echo "${bucketStatus}" | grep 'Invalid bucket name'; then
        return 1
    elif echo "${bucketStatus}" | grep 'Not Found'; then
        return 1
    elif echo "${bucketStatus}" | grep 'Forbidden'; then
        echo "Bucket exists but not owned"
        return 1
    elif echo "${bucketStatus}" | grep 'Bad Request'; then
        echo "Bucket name specified is less than 3 or greater than 63 characters"
        return 1
    else
        return 1
    fi
}

function create_params {
    local local_folder="$HOME/$1"
    local bucket_folder="s3://${properties[s3_bucket]}$local_folder"

    params+=("$local_folder" "$bucket_folder")

    if (( ${#excludes[@]} )); then
        params+=("${excludes[@]}")
    fi

    if (( ${#includes[@]} )); then
        params+=("${includes[@]}")
    fi

    params+=("--profile=${properties[aws_profile]}")

    if (( "${properties(dryrun)}" = true )); then
        params+=(--dryrun)
    fi

    if (( "${properties(debug)}" = true )); then
        declare -p params
    fi
}

# Sync is automatically recursive, and it can't be turned off. Sync
# checks whether any files have changed since latest upload, and knows
# to avoid uploading files, which are unchanged.
function sync {
    aws s3 sync "${params[@]}"
}

# Copy can be run for individual files, and recursion can be avoided,
# when necessary. Copy doesn't check whether the file in source has
# changed since the last upload to target, but will always upload
# the files. Thus, use only when necessary to avoid sync.
function copy {
    local basePath="${params[0]}*"

    # Loop through files in the given path.
    for file in $basePath; do
        # Check that the file is not a folder or a symbolic link.
        if [[ ! -d "$file" && ! -L "$file" ]]; then
            # Remove the first parameter, i.e., the local folder, since with
            # copy we need to specify individual files instead of the
            # base folder.
            unset 'params[0]'
            aws s3 cp "$file" "${params[@]}"
        fi
    done
}

function process_patterns {
    # If the second parameter is not defined, it's pointless to read
    # anything, since there's no guidance on what to do with the data.
    if [[ -z "$2" ]]; then
        return 1;
    fi

    # If the file defined in the first parameter exists, then loop
    # through its content line by line, and process it.
    if [[ -f "$1" ]]; then
        while read -r line; do
            if [[ "$2" == "include" ]]; then
                includes+=(--include "$line")
            elif [[ "$2" == "exclude" ]]; then
                excludes+=(--exclude "$line")
            fi
        done < "$1"
    fi
}

# Reset the variables used in global scope.
# To be called after each cycle of the main loop.
function reset {
    unset includes excludes params
}

# The "main loop" that goes through folders that need to be
# backed up.
function handleFolder {
    process_patterns "${1}/${properties(exclude_file_name)}" exclude
    process_patterns "${1}/${properties(include_file_name)}" include
    
    # Remove the beginning of the path until the last forward slash.
    create_params "${1##*/}"
    
    if (( "$2" == "sync" )); then
        sync
    elif (( "$2" == "copy" )); then
        copy
    else
        echo "Don't know what to do."
    fi

    reset
}

function usage {
    cat << EOF
Usage: ${0##*/} [-dDh]

    -d, --debug   enable debug mode
    -D, --dryrun  execute commands in dryrun mode, i.e., don't upload anything
    -h, --help    display this help and exit

EOF
}

while getopts ":dDh-:" option; do
    case "$option" in
        -)
            case "${OPTARG}" in
                debug)
                    properties[debug]=true
                    ;;
                dryrun)
                    properties[dryrun]=true
                    ;;
                help)
                    # Send output to stderr instead of stdout by
                    # using >&2.
                    usage >&2
                    exit 2
                    ;;
                *)
                    echo "Unknown option --$OPTARG" >&2
                    usage >&2
                    exit 2
                    ;;
            esac
            ;;
        d)
            properties[debug]=true
            ;;
        D)
            properties[dryrun]=true
            ;;
        h)
            usage >&2
            exit 2
            ;;
        *)
            echo "Unknown option -$OPTARG" >&2
            usage >&2
            exit 2
            ;;
    esac
done

# set -x shows the actual commands executed by the script. Much better
# than trying to run echo or printf with each command separately.
if (( "${properties(debug)}" = true )); then
    set -x
fi

loadProperties

# $? gives the return value of the previous function call; a non-zero value
# means that an error of some type occurred.
if (( $? != 0 )); then
    exit
fi

getBucketName

if (( $? != 0 )); then
    exit
fi

checkBucket

if (( $? != 0 )); then
    exit
fi

# Add an asterisk in the end for the loop to work, i.e.,
# to loop through all files in the folder.
backup_config_path="$CONFIG_FOLDER/${properties(backup_folder)}*"

# Change shell options (shopt) to include filenames beginning with a
# dot in the file name expansion.
shopt -s dotglob

# Loop through files in given path, i.e., subfolders of home folder.
for folder in $backup_config_path; do
    # Check that file is a folder, and that it's not a symbolic link.
    if (( -d "$folder" && ! -L "$folder" )); then
        handleFolder "$folder" "sync"
    fi
done

# Also include the files in home folder itself, but use copy to avoid
# recursion. Home folder & all subfolders contain over 500k files,
# and takes forever to go through them all with sync, even with an
# exclusion pattern.
# Remove the last character (asterisk) from the end of the config path.
handleFolder "${backup_config_path::-1}" "copy"

Properties file (SSM parameter name censored):

# AWS profile to be used
aws_profile=personal

# Bucket to sync files to
bucket_parameter_name=(my_ssm_parameter_name)

# Config folder where backup folders & files are found.
backup_folder=backup/

# Names of the files defining the include & exclude patterns for each folder.
include_file_name=includes.txt
exclude_file_name=excludes.txt

Example include file:

*.gif
*.jpg

Example exclude file to pair with above:

*
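
For completeness, a typical invocation, assuming the script is called from the project root and using the flags defined in usage(), looks like this:

# Dry run with debug output; nothing is uploaded to S3.
./scripts/sync.sh --dryrun --debug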

The script works, but I'm interested in how to improve it. For instance, the error handling feels a bit clumsy.

How to set up disk encryption + periodic backup (SSD -> HDD -> Cloud) in a dual-boot environment?

Hi, super users! I have a Dell laptop with a 1 TB SSD (half Windows 10, half Ubuntu 21.04), GPT partitions, and EFI boot. I also have a 1 TB HDD installed in the laptop that I intend to use only to back up the data on my SSD.

I need some high-level guidance on how to do the setup, please. I'd like to enable encryption on the SSD, then periodically trigger the backup to the HDD (e.g. every week, or whenever I shut down the laptop). Ideally, my backup should be able to restore both systems, including boot, so that if the laptop explodes (or there is a ransomware attack, etc.) I can buy a new one and recover everything as if nothing had happened (it's OK to lose a few days of data). That implies I also need to periodically upload the backup data to cloud storage.

That's the use case, but I don't know how to enable the encryption or what software would allow me to do that. Do any of you have a similar setup? How can I achieve this? I'm open to any suggestions; the only hard requirement is the encryption part, so a simple cloud-sync tool like the Google Drive client is not enough.
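
To make the "periodic backup to HDD and cloud" half of this more concrete, the Linux-side job I imagine looks roughly like the sketch below; the paths, the HDD mount point, and the rclone remote name are placeholders, and this says nothing yet about the encryption or the Windows half:

# Hypothetical weekly job: mirror the Ubuntu data to the backup HDD, then
# push that copy to a cloud remote ("mycloud" is a placeholder rclone remote).
rsync -aAX --delete /home/ /mnt/backup-hdd/home/
rclone sync /mnt/backup-hdd/home/ mycloud:laptop-backup/home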

Thanks!