ssl – Apache Radius proxy authentication fails to start

I have apache2 installed and running as a forward proxy, and now I need to add RADIUS authentication so that only our own users can access our servers through the proxy. Here is the configuration I have for the Apache RADIUS auth:

SSLProxyEngine on
SSLCertificateFile /etc/apache2/ssl/vpn.pem
SSLCertificateKeyFile /etc/apache2/ssl/vpn.key

AddRadiusAuth radius-server-ip:1812 radius-server-key 5:3
AddRadiusCookieValid 60


AuthType Basic
AuthName "Please Enter Your JumpCloud Credentials"
AuthBasicProvider radius
AuthRadiusActive On
AuthRadiusCookieValid 5
AuthRadiusAuthoritative on
Require valid-user

ProxyRequests On
ProxyVia On

Order Deny,Allow
Allow from all

ErrorLog ${APACHE_LOG_DIR}/error_forward_proxy.log
CustomLog ${APACHE_LOG_DIR}/access_forward_proxy.log combined

But every time I try to start Apache, it just says that it could not start. I checked both the error log and the access log, but there is literally nothing in them. Am I missing something here in the config?

Any help or advice is greatly appreciated!

Cheers!

linux – Steps to replace a failed desktop drive in Linux Mint?

The second (/home) drive on my desktop died overnight, and my google-fu is failing me on the steps needed to recover with a new drive.

I have backups of the drive, so I do not need to recover data; I just need to know how to integrate the new drive into the existing system.
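The usual sequence is: partition the new drive (e.g. with fdisk or parted), create a filesystem (mkfs.ext4), find its UUID with blkid, add a matching line to /etc/fstab so it mounts at /home, then restore the backup. As a minimal illustrative sketch (the UUID below is hypothetical; on a real system it comes from `blkid`), this is the shape of the fstab entry:

```python
# Sketch: build the /etc/fstab line for a replacement /home drive.
# The UUID is hypothetical; on a real system you would read it from
# `blkid /dev/sdXN` after partitioning and running mkfs.ext4.

def fstab_entry(uuid: str, mountpoint: str, fstype: str = "ext4",
                options: str = "defaults", dump: int = 0, fsck_pass: int = 2) -> str:
    """Format one /etc/fstab line: device, mountpoint, type, options, dump, pass."""
    return f"UUID={uuid} {mountpoint} {fstype} {options} {dump} {fsck_pass}"

print(fstab_entry("1234abcd-0000-0000-0000-1234abcd5678", "/home"))
# → UUID=1234abcd-0000-0000-0000-1234abcd5678 /home ext4 defaults 0 2
```

Referencing the drive by UUID rather than by device name (/dev/sdb1) keeps the mount stable if drive enumeration order changes after the hardware swap.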

Applying a function to multiple columns of a dataset with missing values fails

Consider a dataset with missing values:

no2 = Dataset[{
  <|"timestamp" -> DateObject[{2000, 1, 1, 1, 0, 0}, "Instant", "Gregorian", 2.],
    "BASCH" -> 108., "BONAP" -> Missing["Unrecognized", "n/d"],
    "PA18" -> 65., "VERS" -> 47.|>,
  <|"timestamp" -> DateObject[{2000, 1, 1, 2, 0, 0}, "Instant", "Gregorian", 2.],
    "BASCH" -> 104., "BONAP" -> 60., "PA18" -> 77., "VERS" -> 42.|>,
  <|"timestamp" -> DateObject[{2000, 1, 1, 3, 0, 0}, "Instant", "Gregorian", 2.],
    "BASCH" -> 97., "BONAP" -> 58., "PA18" -> 73., "VERS" -> 34.|>,
  <|"timestamp" -> DateObject[{2000, 1, 1, 4, 0, 0}, "Instant", "Gregorian", 2.],
    "BASCH" -> 77., "BONAP" -> 52., "PA18" -> 57., "VERS" -> 29.|>,
  <|"timestamp" -> DateObject[{2000, 1, 1, 5, 0, 0}, "Instant", "Gregorian", 2.],
    "BASCH" -> 79., "BONAP" -> 52., "PA18" -> 64., "VERS" -> 28.|>
}]

I can easily get the mean of a given key, even with missing values:

no2[Mean, "BONAP"]
(* 64.0017 *)

But if I try to apply Mean to two columns, the missing values become a problem:

no2[Mean, {"BONAP", "PA18"}]

This returns a dataset that still contains missing values. I suspect this is not the correct syntax, because in the first case the result is numeric, while the second operation returns a Dataset. How do I apply a function to multiple columns?

Edit:

This works:

no2[Mean, #] & /@ {"BASCH", "BONAP", "PA18", "VERS"}

But that's not what I'm looking for. I am looking for a way to do it as part of the Dataset query itself.
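For comparison, the intended column-wise aggregation can be sketched in plain Python (a hypothetical analogue of the Dataset above, not Mathematica code): compute the mean of each requested column while skipping missing entries, here represented by None.

```python
# Sketch: column-wise mean over a list of records, skipping missing values.
# Mirrors the numeric columns of the Dataset above; None plays the role of Missing[].

rows = [
    {"BASCH": 108.0, "BONAP": None, "PA18": 65.0, "VERS": 47.0},
    {"BASCH": 104.0, "BONAP": 60.0, "PA18": 77.0, "VERS": 42.0},
    {"BASCH": 97.0,  "BONAP": 58.0, "PA18": 73.0, "VERS": 34.0},
    {"BASCH": 77.0,  "BONAP": 52.0, "PA18": 57.0, "VERS": 29.0},
    {"BASCH": 79.0,  "BONAP": 52.0, "PA18": 64.0, "VERS": 28.0},
]

def column_means(records, columns):
    """Mean of each named column, ignoring entries that are None (missing)."""
    means = {}
    for col in columns:
        values = [r[col] for r in records if r[col] is not None]
        means[col] = sum(values) / len(values)
    return means

print(column_means(rows, ["BONAP", "PA18"]))
# → {'BONAP': 55.5, 'PA18': 67.2}
```

The key point is the same in either language: the missing entries must be filtered out per column before averaging, rather than letting them propagate through the aggregate.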

mmap() failed: [12] Cannot allocate memory

When I try to execute a command, I get this error at the bottom of the command line:

mmap() failed: (12) Cannot allocate memory


Please help me as soon as possible.
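Errno 12 (ENOMEM) means the process asked the kernel for memory and was refused, typically because RAM plus swap is exhausted. As a hedged diagnostic sketch (Linux-only, assuming /proc/meminfo is available), you can check how much memory the system actually has free before rerunning the command:

```python
# Sketch: read /proc/meminfo to see how much memory is available (Linux only).

def meminfo_kib():
    """Return /proc/meminfo fields as a {name: value-in-kB} dict."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            name, rest = line.split(":", 1)
            info[name] = int(rest.strip().split()[0])  # values are reported in kB
    return info

info = meminfo_kib()
print("MemTotal:    ", info["MemTotal"], "kB")
print("MemAvailable:", info.get("MemAvailable", 0), "kB")
print("SwapTotal:   ", info["SwapTotal"], "kB")
```

If MemAvailable and SwapTotal are both near zero, the usual fixes are adding swap or raising the memory limit of whatever runtime is issuing the mmap call.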

Content Query WebPart filter using a calculated page field value fails

I cannot get a calculated page field (DocIdDelimited) to work as a filter value in a Content Query WebPart. DocIdDelimited resolves to text. I am filtering on whether the text of the calculated field is contained in another text field (RelatedDocId).

I've seen people use calculated filter values with dates, so why does this calculated text field not work?

RelatedDocId and the output type of DocIdDelimited are both Single line of text.

If I create the filter with a text page field with the same content, the filter works.


jboss – EAP6 – "if" request failed while deploying the application

I'm trying to deploy my application on a development server, and it fails at the first if statement in the CLI script. When I connect to the console, I can see the datasource present and active, but for some reason the script fails.

What would prevent the script from finding an active data source?

Code

cd /profile=@jboss.profile@

if (outcome == success) of ./subsystem=datasources/data-source=DataSource:read-resource()
    data-source remove --name=DataSource --profile=@jboss.profile@
end-if

Error

if request failed: JBAS010839: Operation failed or was rolled back on all servers.

SQL Server – Failed to create AppDomain "master.sys[runtime].2"

I receive this error in the SQL Server log:

Failed to create AppDomain "master.sys[runtime].2".
Could not load file or assembly 'System.Data'. Not enough storage is available to process this command.

After this error, my web application does not work.

I have searched around for this but found no adequate solution.

Help me, please.

electrum – Esplora on local Elements regtest – parse failed: data not consumed entirely when explicitly deserializing

This is related to Esplora (the block explorer) and its back-end electrs API. Is it possible to run Esplora against a local Elements regtest?

When I run electrs, this is the error I get back:

DEBUG - Server listening on 127.0.0.1:44224
DEBUG - Running accept thread
INFO - NetworkInfo { version: 180101, subversion: "/Elements Core:0.18.1.1/" }
INFO - BlockchainInfo { chain: "liquidregtest", blocks: 1, headers: 1, bestblockhash: "9cc7c8fb1c8e2e1e8ed184f5e31548eb5859b74e7552ba8841c41aeeb24d0ae3", pruned: false, verificationprogress: 0.334, initialblockdownload: Some(false) }
DEBUG - opening DB at "./db/liquidregtest/newindex/txstore"
DEBUG - 0 blocks were added
DEBUG - opening DB at "./db/liquidregtest/newindex/history"
DEBUG - 0 blocks were indexed
DEBUG - opening DB at "./db/liquidregtest/newindex/cache"
DEBUG - downloading all block headers up to 9cc7c8fb1c8e2e1e8ed184f5e31548eb5859b74e7552ba8841c41aeeb24d0ae3
TRACE - downloading 2 block headers
ERROR - server failed: Error: failed to parse header 000000a021cab1e5da4718ea140d9716931702422f0e6ad915c8d9b583cac2706b2a9000ac20a615d9b0d4df3e3ac2cb7018a07bd314d6bb715a57adead7c03e208b3658890e9d5d01000000012200204ae81572f06e1b88fd5ced7a1a000945432e83e1551e6f721ee9c00b8cc332604b00000000010151
Caused by: parse failed: data not consumed entirely when explicitly deserializing

The command used to launch electrs is:

cargo run --features liquid --release --bin electrs -- -vvvv --daemon-dir ~/.elements/elements-0.18.1.1/elementsdir/ --daemon-rpc-addr 127.0.0.1:18886 --cookie user:password --network liquidregtest -v

And there is an elementsd running on 127.0.0.1:18886 with the chain name liquidregtest (the configuration file contains the line chain=liquidregtest). In fact, with elements-cli I successfully requested an address and ran a generate command, which returned the hash "9cc7c8fb1c8e2e1e8ed184f5e31548eb5859b74e7552ba8841c41aeeb24d0ae3" that you see in the debug output (which means that the block was created).
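The "data not consumed entirely" error means the deserializer finished decoding a block header but bytes were left over, i.e. the header on the wire is longer than the structure electrs expected (Elements headers carry extra consensus fields compared to plain 80-byte Bitcoin headers). A generic sketch of that class of strict-deserialization check (purely illustrative, not electrs code):

```python
# Sketch: a strict deserializer that fails when trailing bytes remain,
# the same class of error electrs reports. Not actual electrs code.
import struct

def parse_exact(data: bytes, fmt: str):
    """Unpack `fmt` from `data`, failing if any bytes are left unconsumed."""
    size = struct.calcsize(fmt)
    if len(data) != size:
        raise ValueError(
            f"parse failed: data not consumed entirely "
            f"(expected {size} bytes, got {len(data)})"
        )
    return struct.unpack(fmt, data)

# An 80-byte Bitcoin-style header (version, prev hash, merkle root,
# time, bits, nonce) parses cleanly...
parse_exact(b"\x00" * 80, "<i32s32sIII")

# ...but a longer, Elements-style header leaves bytes over and is rejected.
try:
    parse_exact(b"\x00" * 100, "<i32s32sIII")
except ValueError as e:
    print(e)
```

So a header hex string longer than the parser's expected layout, as in the log above, triggers exactly this failure mode; it usually indicates the indexer was built without (or with the wrong) support for the chain's extended header format.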

Failed to import Bootstrap Paragraphs sub-module configuration

I've tried to implement a Hero with the Xeno Hero sub-module of Bootstrap Paragraphs in Drupal 8. I encountered an error while trying to import the sub-module's configuration. I do not know if I am doing it right, but I ended up with this error:

The configuration cannot be imported because validation failed for the following reasons:
Configuration paragraph.xeno_hero.default depends on configurations (field.field.paragraph.xeno_hero.xeno_content, field.field.paragraph.xeno_hero.xeno_invert, field.field.paragraph.xeno_hero.xeno_offset, field.field.paragraph.xeno_hero.xeno_overlay, field.field.paragraph.xeno_hero.xeno_parallax, paragraphs.types.xeno_hero) that will not exist after import.

What I did was a single item import, but I do not know which configuration type to choose, so I tried Entity form display and Entity view display, as well as the fields.

Could someone point me in the right direction, or give me some clues I could follow to get this working?