How do you make structural changes to Azure SQL Server DBs when you don’t have access via SSMS?

We’re no longer allowed access to the UAT/PROD environments via SSMS at my organization. Our deployment process is tied to git pushes: pushing to the DEV branch updates the DEV web code and database, and likewise for the QA and UAT branches.

The problem is that when there is a structural change to the DB, the deployment very often fails with a "data loss may occur" error. In the past, with on-prem solutions, when we hit that error while publishing the DB we could uncheck "Block incremental deployment if data loss might occur" in Visual Studio and the deployment would work; we never actually incurred any data loss either. Since this option is no longer available to us, it was suggested we use pre- and post-deployment scripts.

What do I need to put in the pre- and post-deployment scripts to prevent the data loss error? Our Visual Studio DB project already contains all the table/view/SP/function definitions.
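
One pattern I have seen suggested is to move the at-risk data aside in the pre-deployment script and put it back in the post-deployment script. A minimal sketch, assuming a hypothetical dbo.Orders table whose LegacyStatus column the new model drops in favour of a Status column (all names here are illustrative, not from our project):

-- Pre-deployment script: runs before the schema diff is applied.
-- Preserve the column that the new model is about to drop.
IF COL_LENGTH('dbo.Orders', 'LegacyStatus') IS NOT NULL
BEGIN
    SELECT OrderId, LegacyStatus
    INTO dbo.DeployStaging_OrdersLegacyStatus
    FROM dbo.Orders;
END

-- Post-deployment script: runs after the schema diff is applied.
-- Copy the preserved values into the new column, then clean up.
IF OBJECT_ID('dbo.DeployStaging_OrdersLegacyStatus') IS NOT NULL
BEGIN
    UPDATE o
    SET    o.Status = s.LegacyStatus
    FROM   dbo.Orders AS o
    JOIN   dbo.DeployStaging_OrdersLegacyStatus AS s
           ON s.OrderId = o.OrderId;

    DROP TABLE dbo.DeployStaging_OrdersLegacyStatus;
END

For completeness, the old checkbox also still exists as a publish option: setting BlockOnPossibleDataLoss to False in the publish profile (or passing /p:BlockOnPossibleDataLoss=false to SqlPackage in the pipeline) restores the old behaviour, with the same risk it always carried.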

My Ubuntu 20.04 server is using A LOT of memory

Apologies in advance: this may be a very beginner question.

I run a web app on this server, along with node.js programs that use Elasticsearch. The server has 16 GB of memory, but it constantly uses over 15 GB. I know I can lower the Elasticsearch memory limit from 4 GB to something smaller, but I really don’t think the server should be using this much memory. I have a screenshot of memory usage below. I am hoping I am doing something wrong and that it won’t require a hardware upgrade.

[screenshot of memory usage]

Thank you so much in advance.

sql server – Cannot run query to get deadlock graph in a timely fashion

I am trying to get deadlock information from a SQL Server instance using this query:

select XEventData.XEvent.value('(data/value)[1]', 'varchar(max)') as DeadlockGraph
FROM
(select CAST(target_data as xml) as TargetData
from sys.dm_xe_session_targets st
join sys.dm_xe_sessions s on s.address = st.event_session_address
where name = 'system_health') AS Data
CROSS APPLY TargetData.nodes ('//RingBufferTarget/event') AS XEventData (XEvent)
where XEventData.XEvent.value('@name', 'varchar(4000)') = 'xml_deadlock_report'

The query, however, takes forever and returns an empty result.

Why does it take so long, and is the deadlock information from these views retroactive, so that I can pinpoint a deadlock that occurred some time ago?
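
For reference, shredding the entire ring buffer XML with //RingBufferTarget/event is expensive, and the ring buffer only retains recent events anyway. Below is a sketch of the same lookup against the event_file target of the system_health session, which is usually faster and keeps more history (this assumes SQL Server 2012 or later, with the .xel files in the default log directory):

-- Read deadlock reports from the system_health event file target
-- instead of the ring buffer.
SELECT CAST(event_data AS xml).query('(event/data/value/deadlock)[1]') AS DeadlockGraph
FROM sys.fn_xe_file_target_read_file('system_health*.xel', NULL, NULL, NULL)
WHERE object_name = 'xml_deadlock_report';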

Asia Dedicated server recommendation | Web Hosting Talk

Please recommend a dedicated server company in Asia, preferably in Japan or Korea. The server must be located in one of those countries.

Windows Server 2016 schedule automatic updates not working as set in Group Policy

I have a Windows Server 2016 VPS where a web app is hosted. It works fine, but when Windows installs updates nothing works: CPU usage hits 99% and no user can access the app.

I’ve set up automatic updates in Group Policy this way:

  • in Computer Configuration/Administrative Templates/Windows Components/Windows Update,
    I set “Configure Automatic Updates” to Enabled
  • then chose value 4 (auto download and schedule the install), with the time set to every Monday at 3am; the automatic maintenance box is not checked
  • restarted the system

But on Saturday at 11am no client could work on the web app because of automatic updates (CPU at 99%), and I had to disable automatic updates, after which everything went fine.

How do I get scheduled updates to work as set in Group Policy?
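
For reference, these policy settings should surface as registry values under HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU, so one thing worth checking is whether the values on the VPS actually match the policy. A sketch of what I would expect to find there (value meanings assumed from the policy descriptions: AUOptions 4 = auto download and schedule the install, ScheduledInstallDay 2 = Monday, ScheduledInstallTime 3 = 03:00):

Windows Registry Editor Version 5.00

; expected effective values if the GPO applied correctly
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU]
"NoAutoUpdate"=dword:00000000
"AUOptions"=dword:00000004
"ScheduledInstallDay"=dword:00000002
"ScheduledInstallTime"=dword:00000003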

thanks

server – wordpress website migration with multiple languages/instances

Connection string for my SQL Server connection from C on Linux

I am trying to connect to SQL Server from a C program on Linux.

My connection string is this:

DRIVER = {ODBC Driver 17 for SQL Server}; Server = 192.168.0.25,52000;Database = db; UID = ud, PWD = pw;

When I want to perform a test connection, I get this error:

[unixODBC][Microsoft][ODBC Driver 17 for SQL Server]Login failed for user ''

Something must be wrong with the connection string, because it shows an empty user '', not 'ud' as in the string above.

So I tried to connect with:

isql -v test

My odbc.ini file is:

[test]
Driver = ODBC Driver 17 for SQL Server
Server = tcp:192.168.0.25,52000
UID = ud
PWD = pw
Database = db

Unfortunately the results were the same:

[unixODBC][Microsoft][ODBC Driver 17 for SQL Server]Login failed for user ''

So I decided to connect with

isql -v test ud "pw"

And now I finally connected. So what is wrong with my connection string/odbc.ini file? I have read many tutorials and both look OK to me, but somehow UID and PWD seem to be omitted.
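
My best guess so far: ODBC connection-string attributes are separated by semicolons, but the string above separates UID and PWD with a comma, so everything after UID = ud may be parsed as part of a single (invalid) UID value. A sketch of the string with the comma replaced and the whitespace around the keywords removed (assuming that is the only problem):

DRIVER={ODBC Driver 17 for SQL Server};SERVER=192.168.0.25,52000;DATABASE=db;UID=ud;PWD=pw;

As for the odbc.ini case, as far as I can tell the Microsoft driver ignores credentials stored in the DSN, which would explain why isql only succeeds when the username and password are passed on the command line.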

installation – Install Drupal 9 on Ubuntu Server LTS 20.04: Any improvements to Article?

Used HP Smart Start to establish Raid1
Installed Ubuntu Server LTS 20.04 23/07/2021
Language: English
Keyboard: English(US) English(US) - English(US,Alt,Intl)
Network: enp3s4f1 IPv4
    Subnet 192.168.2.0/24
    Address 192.168.2.3
    Gateway 192.168.2.1
    Name Servers 203.109.191.1, 203.118.191.1, 203.96.152.4, 203.96.152.12, 8.8.8.8, 8.8.4.4
User:   Name Kevin
    Name Server kgwebsite
    User Name kevin
    Password
Installed OpenSSH
Import keys No
Installed Powershell
Reboot
$sudo passwd root
Cntl-D
# cd /
# apt purge cloud-init -y
# rm -Rf /etc/cloud
# apt remove open-iscsi
# apt autoclean && apt autoremove
# reboot
# apt install net-tools
# ifconfig -a
# apt install ntp
# ntpq -p
# mkdir /media/usb
# apt install unzip zip usbmount gpm
# apt install tasksel
# systemctl enable --now mysql
# systemctl enable --now apache2
# mysql -u root -p (entered pwd)
Cntl-L (clear screen)
# ufw app list
# ufw allow in "Apache"
# ufw allow in "OpenSSH"
# ufw status
# ufw enable
Browser: http://192.168.2.3 (output "Apache Default Page")
# reboot
# nano /etc/sysctl.conf (Lookout for 100's of "i2c i2c-1 sendbytes" in console before log in)
    # Uncomment the following to stop low-level messages on console
kernel.printk = 3 4 1 3
Cntl-O Cntl-X
# apt install mc
    Menu -> "Left -> Listing format"
    Enter User "full name:30 owner group perm space size space mtime space name"
    Menu "Options -> Panel options -> Auto Save Panels Setup" x
    F10
# nano /root/.screenrc
    "defscroll 5000"
Browser helpful: https://computingforgeeks.com/how-to-install-drupal-cms-on-ubuntu-linux/
# service mysql status ("active (running)")
# mysql_secure_installation
    VALIDATE PASSWORD no
    Enter root passwd
    Remove anonymous user yes
    Disallow root login remotely no
    Remove test database yes
    Reload privilege tables yes
    All done exit
# mysql -u root
>UPDATE mysql.user SET plugin = 'mysql_native_password' WHERE user = 'root';
>FLUSH PRIVILEGES;
>exit;
# mysql_secure_installation
    VALIDATE PASSWORD no
    Enter root passwd
    All done exit
# mysql -u root -p (entered pwd accepted)
>-- DROP USER 'drupal'@'localhost';
>-- DROP DATABASE drupal;
>CREATE DATABASE drupal;
>CREATE USER 'drupal'@'localhost' IDENTIFIED BY 'Str0ngDrupaLP@SS';
>GRANT ALL ON drupal.* TO 'drupal'@'localhost';
>FLUSH PRIVILEGES;
>exit;
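A quick sanity check that the new account works before continuing (optional):
# mysql -u drupal -p drupal (enter Str0ngDrupaLP@SS)
>SHOW GRANTS FOR CURRENT_USER();
>exit;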
# cd /var/www
# mkdir rescuerobot
Windows Notepad++
<VirtualHost *:80>
     ServerName rescuerobot.org
     ServerAlias www.rescuerobot.org
     ServerAdmin admin@rescuerobot.org
     DocumentRoot /var/www/rescuerobot/web/

     CustomLog ${APACHE_LOG_DIR}/access.log combined
     ErrorLog ${APACHE_LOG_DIR}/error.log

      <Directory /var/www/rescuerobot/web>
            Options Indexes FollowSymLinks
            AllowOverride All
            Require all granted
            RewriteEngine on
            RewriteBase /
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]
   </Directory>
</VirtualHost>
Menu -> Edit -> EOL Conversion -> Unix (LF)
Saved rescuerobot.conf to USB stick
# lsblk
# mount /dev/sdb1 /media/usb
# ls /media/usb
# cp -av /media/usb/rescuerobot.conf /etc/apache2/sites-available
# umount /media/usb
# apt install php-{cli,fpm,json,common,mysql,zip,gd,intl,mbstring,curl,xml,pear,tidy,soap,bcmath,xmlrpc}
# nano /etc/php/7.4/apache2/php.ini
    enter: memory_limit = 256M
    enter: date.timezone = Pacific/Auckland
    uncomment:  extension = openssl
            extension = curl
            extension = gd2
            extension = mbstring
    Cntl-O Cntl-X
# a2enmod rewrite
# apachectl -t
# a2dismod mpm_event
# a2enmod mpm_prefork
# a2enmod php7.4
# a2ensite rescuerobot (this links rescuerobot.conf into sites-enabled)
# nano /etc/apache2/apache2.conf
    #Global configuration
    ServerName kgdomain.kgwebsite
# a2dissite 000-default (check sites-available and sites-enabled)
# apache2ctl configtest
# systemctl restart apache2
# mkdir /var/www/rescuerobot/web
# cp -av /media/usb/info.php /var/www/rescuerobot/web
Browser: http://192.168.2.3/info.php (Works!)
# rm -Rv /var/www/rescuerobot/*
Windows Notepad++ kginstall
Browser helpful: https://getcomposer.org/download/
php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
php -r "if (hash_file('sha384', 'composer-setup.php') === '756890a4488ce9024fc62c56153228907f1545c228516cbf63f885e036d37e9a59d27d63f46af1d4d07ee0f76181c7d3') { echo 'Installer verified'; } else { echo 'Installer corrupt'; unlink('composer-setup.php'); } echo PHP_EOL;"
php composer-setup.php
php -r "unlink('composer-setup.php');"
# cp -av /media/usb/kginstall /var/www
# cd /var/www
# ./kginstall
# mkdir bin
# cp -av composer.phar ./bin
# ./bin/composer.phar create-project drupal/recommended-project:9.2.2 rescuerobot
# ./bin/composer.phar require drush/drush
# mkdir /var/www/rescuerobot/web/sites/default/files
# chmod 775 files (drwxrwxr-x)
# cp -av default.settings.php settings.php
Browser: http://192.168.2.3/core/install.php (resolve Requirements problems, e.g. File System; find the small "Try again" link at the bottom of the page)
# cd /var/www/rescuerobot
# chown -R root:www-data .
"Try again"
Install completed.

None of the above is guaranteed to be correct or complete. The mysql script has been modified to run on MySQL 8. Please treat this article as a guide; any advice or corrections would be much appreciated.
Kevin.

Virtual Private Server

Using a VPS (virtual private server) is the perfect middle ground for many businesses. Shared web hosting may be inexpensive and newbie-friendly, but it lacks the versatility, functionality, and scalability that growing businesses need. Dedicated servers, on the other hand, are complex to set up and comparatively expensive.

One of the top choices for VPS hosting is PDHosting. Visit their website to see their VPS hosting plans.

aws – What should the server architecture of a service look like that stores files from a desktop application in the cloud (S3/Cloud Storage)?

I developed a desktop application and am in the process of adding support for online cloud storage. The main requirement is to allow the user to store files in the cloud while being able to delete them locally to save space (this is not possible with Dropbox or Google Drive).

My initial idea is to set up a server with Nginx that accepts incoming connections and forwards them to a web service, acting as a reverse proxy.

If the incoming request is a download/upload, the request is redirected to the S3/GCS server. I want to avoid a direct connection to the S3/GCS bucket. Is this a suitable architecture?

TL;DR: What should an architecture look like where a desktop application sends files to a custom cloud server?

Dropbox and Google Drive are not suitable for my workflow because they don’t allow deleting a file locally while keeping it in the cloud; files and directories are always synced.
