## Office 365 – Video failed to upload to Microsoft Stream

We usually record our meetings through Teams, and these get automatically uploaded to Stream. However, yesterday the automatic upload failed with this message:

I downloaded the video manually (through Teams) and was hoping to upload it to our Stream myself; however, every time I try I receive the following error message:

video.mp4 failed to upload. Changes to this Microsoft Stream tenant are temporarily disabled. Please try again later.
Failed: video.mp4

Any ideas what this is related to?

I have tried multiple times yesterday and today.
I also asked another team member to try, and they receive the same error message.

## Background Video – Stream vs Host / Mobile vs Desktop SEO effect

Background videos have no direct impact on SEO. Google doesn’t penalize for background video any more than it does for images.

It all comes down to the implementation details:

• Will the video delay the load of the usable elements of the page?
• Does the video make it harder to use the rest of the page?

## algorithms – Distributed predicate computation on event stream

My question is actually a request for papers, articles, texts, or books on a problem I’m trying to solve at work.

I’m working on a program that computes a predicate value (true or false) for a given object in a distributed system, in which there is a stream of events that can change the object’s attributes and, consequently, the predicate value. Whenever the predicate value changes, the program must send a notification about the change.

For example, consider that there is an object `A` which has an attribute called `name` and consider that there is a predicate `P` which is true when the object’s `name` is equal to `Jhon`.
Each event in the stream has a timestamp and a value for the attribute `name`. So consider the following sequence of events:

```
e1 = { name: Jhon, timestamp: 1 }
e2 = { name: Jhon, timestamp: 2 }
e3 = { name: Peter, timestamp: 3 }
e4 = { name: Doug, timestamp: 4 }
e5 = { name: Jhon, timestamp: 5 }
```

Now, the events don’t necessarily show up in the stream in the correct order and, even worse, there are multiple computers processing this stream of events in parallel. However, for simplicity, I’ll continue this example considering only one computer.

If the events arrive and are processed in the order described above, then the notifications sent should be:

```
P(A) = true when e1 arrives
P(A) = false when e3 arrives
P(A) = true when e5 arrives
```

That is the correct sequence of notifications.
Now, imagine that the computer receives the events in the following order:

```
e1, e5, e2, e4, e3
```

A naive algorithm which doesn’t consider the event’s timestamp would send an incorrect sequence of notifications:

```
P(A) = true when e1 arrives
P(A) = false when e4 arrives
```

The algorithm that I’m working on considers the timestamps and infers when a notification should have been sent but was not. So when `e3` arrives, it will notice that the notification `P(A) = true` for `e5` was not sent.
I would like some references on this problem or on something similar, such as papers dealing with this kind of problem.
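For illustration, here is a minimal single-machine sketch of the timestamp-aware idea. It assumes, unlike the real system, that all buffered events fit in memory, and every name in it is mine, not from an existing library:

```python
def predicate(event):
    # P(A): true when the object's name equals "Jhon".
    return event["name"] == "Jhon"

def transitions(events):
    """Change points of the predicate, in timestamp order."""
    out, prev = [], None
    for e in sorted(events, key=lambda e: e["timestamp"]):
        value = predicate(e)
        if value != prev:
            out.append((e["timestamp"], value))
        prev = value
    return out

class PredicateTracker:
    """Buffers out-of-order events and reports the notifications that the
    timestamp order implies but that have not been sent yet."""

    def __init__(self):
        self.events = []
        self.sent = []  # transitions already notified

    def on_event(self, event):
        self.events.append(event)
        current = transitions(self.events)
        # Emit only transitions we have not announced before.
        new = [t for t in current if t not in self.sent]
        self.sent = current
        return new
```

Feeding the arrival order `e1, e5, e2, e4, e3` from the example, the tracker ends up with exactly the correct sequence (1, true), (3, false), (5, true). Note that a late event can also invalidate an already-sent notification (the (4, false) emitted when `e4` arrives is superseded once `e3` shows up), which is part of the correction problem described above.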

The real problem is considerably more complex, since it involves storing the predicate $\times$ object state in a database that works as shared state between the computers processing the stream. And I’m talking about thousands of events arriving per second, so it’s not possible to keep all events stored in some database.

## network – Have I been hacked? (netstat output shows too many dgram and stream connections)


These are the output images of the netstat command I ran; they show that there are too many outbound connections and many dgram and stream sockets. I also captured the traffic using Wireshark and then looked up which organizations the IP addresses belong to (using www.arin.net); various organizations showed up (Google, Astricia).

I also tried turning off the Wi-Fi and then ran netstat again, but there was no change in the dgram and stream connections.

## magento2 – Magento 2.4 installation using composer (MagentoHackathon/Composer/Magento/Plugin.php failed to open stream: No such file or directory), rolling back

I am trying to install Magento 2.4 Open Source on the following setup:

VirtualBox 6.1 on Windows 10

Nginx server

Ubuntu 20.04 installed on the VirtualBox

PHP 7.3.20

Command used: `composer create-project --repository-url=https://repo.magento.com/ magento/project-community-edition .`

I have given 0777 permissions to the /var/www/html/qa.magento.com/ directory.

I am getting the following error: `Plugin installation failed (include(/var/www/html/qa.magento.com/vendor/magento/magento-composer-installer/src/MagentoHackathon/Composer/Magento/Plugin.php): failed to open stream: No such file or directory), rolling back`

Detailed Error:

```
muk@muk:/var/www/html/qa.magento.com$ composer create-project --repository-url=https://repo.magento.com/ magento/project-community-edition .
Creating a "magento/project-community-edition" project at "./"
Installing magento/project-community-edition (2.4.0)
Created project in /var/www/html/qa.magento.com/.
Updating dependencies (including require-dev)
Package operations: 526 installs, 0 updates, 0 removals
Plugin installation failed (include(/var/www/html/qa.magento.com/vendor/magento/magento-composer-installer/src/MagentoHackathon/Composer/Magento/Plugin.php): failed to open stream: No such file or directory), rolling back
  - Removing magento/magento-composer-installer (0.1.13)

[RuntimeException]
Could not delete /var/www/html/qa.magento.com/vendor/magento/magento-composer-installer/src/MagentoHackathon:

create-project [-s|--stability STABILITY] [--prefer-source] [--prefer-dist] [--repository REPOSITORY] [--repository-url REPOSITORY-URL] [--add-repository] [--dev] [--no-dev] [--no-custom-installers] [--no-scripts] [--no-progress] [--no-secure-http] [--keep-vcs] [--remove-vcs] [--no-install] [--ignore-platform-reqs] [--] [<package>] [<directory>] [<version>]
```

## Can the live caption in Google Meet be seen in a live stream of the meeting?

I would like to live stream a Google Meet to YouTube or Facebook, but I would also want the closed captioning to be visible in the live stream. Does the closed captioning pull through to the live stream?

## Media Comments on Activity Stream when using Open Social

I have added the ability for users to upload media to my Open Social install. I have a media type of Photo and have configured media to create a message that displays in the activity stream.

For some reason, no matter which media activity you comment on in the stream, the comment is displayed on the topmost media activity. The stream is sorted by “Authored on” (descending).

I have easily recreated this on a fresh Open Social install as well:

1 – Fresh install
2 – Enable Media
3 – Create a comment type for Media
4 – Create a media type of Photo
5 – Create a new message template to display Media: Photo on the Stream (home) when media is created
6 – Edit the Activity stream view and remove “Activity: Filter activities for personalised homepage (= )” from the filter. (I still don’t know what this fully filters activities by.)

Error:

Notice: Trying to get property ‘target_id’ of non-object in
Drupal\social_post\Plugin\Field\FieldFormatter\CommentPostFormatter->viewElements() (line 115 of
/var/www/opensocial/html/profiles/contrib/social/modules/social_features/social_post/src/Plugin/Field/FieldFormatter/CommentPostFormatter.php)

Complete Code:

```
<?php

namespace Drupal\social_post\Plugin\Field\FieldFormatter;

use Drupal\comment\Plugin\Field\FieldType\CommentItemInterface;
use Drupal\comment\Plugin\Field\FieldFormatter\CommentDefaultFormatter;
use Drupal\Core\Field\FieldItemListInterface;
use Drupal\Core\Form\FormStateInterface;
use Drupal\Core\Entity\EntityInterface;
use Drupal\comment\CommentManagerInterface;
use Drupal\comment\CommentInterface;

/**
 * Provides a post comment formatter.
 *
 * @FieldFormatter(
 *   id = "comment_post",
 *   module = "social_post",
 *   label = @Translation("Comment on post list"),
 *   field_types = {
 *     "comment"
 *   },
 *   quickedit = {
 *     "editor" = "disabled"
 *   }
 * )
 */
class CommentPostFormatter extends CommentDefaultFormatter {

  /**
   * {@inheritdoc}
   */
  public static function defaultSettings() {
    return [
      'order' => 'ASC',
    ];
  }

  /**
   * {@inheritdoc}
   */
  public function viewElements(FieldItemListInterface $items, $langcode) {
    $elements = [];
    $output = [];

    $field_name = $this->fieldDefinition->getName();
    $entity = $items->getEntity();

    $status = $items->status;

    if ($status != CommentItemInterface::HIDDEN && empty($entity->in_preview) &&
      // Comments are added to the search results and search index by
      // comment_node_update_index() instead of by this formatter, so don't
      // return anything if the view mode is search_index or search_result.
      !in_array($this->viewMode, ['search_result', 'search_index'])) {
      $comment_settings = $this->getFieldSettings();

      $comment_count = $entity->get($field_name)->comment_count;

      // Only attempt to render comments if the entity has visible comments.
      // Unpublished comments are not included in
      // should display if the user is an administrator.
      $elements['#cache']['contexts'][] = 'user.permissions';

      $mode = $comment_settings['default_mode'];
    }

        'attributes' => [
          'class' => [
            'btn',
            'btn-flat',
            'brand-text-primary',
          ],
        ],
      ];

      // Set path to post node.

      // Attach the attributes.

      }
    }
  }

      // Append comment form if the comments are open and the form is set to
      // display below the entity. Do not show the form for the print view mode.
      if ($status == CommentItemInterface::OPEN && $comment_settings['form_location'] == CommentItemInterface::FORM_BELOW && $this->viewMode != 'print') {
        $elements['#cache']['contexts'][] = 'user';
        // Check if the post has been posted in a group.
        $group_id = $entity->field_recipient_group->target_id;
        if ($group_id) {
          /** @var \Drupal\group\Entity\Group $group */
        }
      }
    }
      $output['comment_form'] = [
        '#lazy_builder' => ['comment.lazy_builders:renderForm', [
          $entity->getEntityTypeId(),
          $entity->id(),
          $field_name,
          $this->getFieldSetting('comment_type'),
        ],
        ],
        '#create_placeholder' => TRUE,
      ];
    }
  }

    $elements[] = $output + [
      '#comment_type' => $this->getFieldSetting('comment_type'),
      '#comment_display_mode' => $this->getFieldSetting('default_mode'),
      'comment_form' => [],
    ];
  }

    return $elements;
  }

  /**
   * {@inheritdoc}
   */
  public function settingsForm(array $form, FormStateInterface $form_state) {
    $element = [];
      '#type' => 'number',
      '#min' => 0,
      '#max' => 10,
    ];
    $orders = [
      'ASC' => $this->t('Oldest first'),
    ];
    $element['order'] = [
      '#type' => 'select',
      '#title' => $this->t('Order'),
      '#description' => $this->t('Select the order used to show the list of comments.'),
      '#default_value' => $this->getSetting('order'),
      '#options' => $orders,
    ];
    return $element;
  }

  /**
   * {@inheritdoc}
   */
  public function settingsSummary() {
    return [];
  }

  /**
   * {@inheritdoc}
   */
    $query = db_select('comment_field_data', 'c');
    $query
      ->condition('c.entity_id', $entity->id())
      ->condition('c.entity_type', $entity->getEntityTypeId())
      ->condition('c.field_name', $field_name)
      ->condition('c.default_langcode', 1)

      $query->condition('c.status', CommentInterface::PUBLISHED);
    }
    if ($mode == CommentManagerInterface::COMMENT_MODE_FLAT) {
    }
    else {
      // See comment above. Analysis reveals that this doesn't cost too
      // much. It scales much much better than having the whole comment
      // structure.
    }

    // Limit the number of results.
    }

    $cids = $query->execute()->fetchCol();

    if ($cids) {
    }

  }

}
```

## Pornsera.com – XXX Discussions, Share Premium Porn, Homemade & Live Captures – Stream Sharing Community (Responsive) | NewProxyLists

You wanted feedback. We’re discussing your pitiful site, not mine.

This just proves you don’t have a clue what you’re talking about. YesPornPlease’s end was due to a legal battle with the owners of Pornhub (MindGeek). Pornhub is likely the least safe!

I was talking about embedding being the safest; Pornhub even has an affiliate program for embedders. As for having no clue, you have no idea who and what I am in the warez industry.

We are glad you’ve made this decision. You would be the last person we want on our site. Surprised @M hasn’t banned you; you’re far too toxic, especially by moderating standards. You should try showing some respect, maybe.

You wanted feedback. You got it. You know, if you can’t stand the heat, get out of the kitchen. I was not that toxic (yet). M and I go a long way back.

Considering I have over 200 uploads and another user has over 200 uploads, that equals over 400.

Are you out of your mind? You must be very confused. I only keep myself busy with profitable business. I wouldn’t stand anywhere near that no-traffic site.

No … try again. Update: because I am too nice of a guy … https://i.postimg.cc/rFzkBPch/screenshot-196.png
He has multiple sites like that. He will be there in the long run, instead of the three months you will last.

We are not trying to piss off our visitors. The “Shortening Disaster” is only for non-registered members.

Disaster … pretty good description. And do you really think visitors will register for that fail?

And whose site are we talking about?

Porn Sera Sera … what never will be, will be, the future’s ours to see …. porn sera sera bye bye….. Doris Day.

## stream processing – How to keep a data warehouse updated?

Suppose there is a system (like an ERP) that writes to a database (not too big, less than 100 GB). You need to export the data from this database to a data warehouse (like Redshift or BigQuery) as many times a day as you can. What would be a good solution for that?
There is a feature in the system that exports only the delta, so this is what I was thinking:

1 – Write an ETL script to query the delta, format it as Avro, and save it in a bucket (GCS or S3)
2 – Trigger a function when the object is inserted, get the object, and insert it into a staging table (one for each table in the origin DB)
3 – Trigger a function to merge the staging table into the main table
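A minimal sketch of the merge in step 3, assuming each row has a primary key `id` and an `updated_at` column (both names are hypothetical, not from the question); in BigQuery or Redshift the same step would be a single `MERGE` statement:

```python
def merge_delta(main, staging):
    """Upsert staging rows into main, keeping the newest version of each id.

    Both arguments are dicts mapping id -> row (a dict containing
    'updated_at'). This mirrors, roughly:
      MERGE main m USING staging s ON m.id = s.id
      WHEN MATCHED THEN UPDATE SET ...
      WHEN NOT MATCHED THEN INSERT ...
    """
    for row_id, row in staging.items():
        current = main.get(row_id)
        # Insert new rows; overwrite existing ones only with fresher data.
        if current is None or row["updated_at"] > current["updated_at"]:
            main[row_id] = row
    return main
```

Comparing `updated_at` makes the merge idempotent, so re-delivering the same delta file (a common failure mode with bucket triggers) does no harm.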

I’m not too happy with this approach because it feels so limited. I think I’m missing something here. Should data in a DW be so hard to maintain? I see a lot of examples of how to insert data into a DW, but very few on how to keep it updated.

Also, suppose that this delta mechanism didn’t exist and we had to use a streaming solution (like Kinesis). That would make things even harder, because data would be ingested into the bucket much faster, generating lots of files. How could I handle a scenario like this, given that DWs are slow to update row by row (BigQuery even limits the number of updates per day)?

## streaming algorithm – A way to express an LTL variant to enforce that a stream of data satisfies some linear-time logic

Linear Temporal Logic (LTL) is used for system verification. In my case, I am investing some time to see the feasibility of using LTL to enforce a constraint on a stream of data. Enough generalities; let’s take a simple example.

The UNTIL operator in the LTL expression `u Until v` means event `u` holds until `v` does. It is a general formula that an infinite number of signal traces can satisfy (see its definition here, page 4), such as:

```
u,u,u,v,v,v,...
u,u,u,u,u,u,...
u,v,v,v,v,v,...
```

In my case, I want to enforce an LTL-like formula on a system receiving a stream of data. Again, let’s take the same Until operator.

Let’s say we have two input signals: one for the constant `u` and one for the constant `v`.

```
u,u,u,u,u,u,...
 , , , ,v,v,...
```

The stream processor taking these inputs, if it is an “UNTIL*” node, would output:

```
u,u,u,u,v,v,...
```

The reason I differentiate UNTIL with an asterisk is the whole point of the question: “u UNTIL* v” is only true when `v` is emitted as output as soon as it appears in the second stream; it is the one single trace satisfying “u UNTIL* v” given our input signals. How can I express this constraint? LTL seems very general for this “constraint-enforcing mechanism”.
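One way to model the UNTIL* node described above is as a stream operator that forwards `u` until the first `v` arrives and forwards `v` from that point on. A minimal sketch, assuming the two inputs are synchronized and the empty slots of the second stream are represented as `None` (the function name is mine):

```python
def until_star(u_stream, v_stream):
    """Yield u until the first v appears, then yield v from that point on."""
    seen_v = False
    for u, v in zip(u_stream, v_stream):
        if v is not None:
            seen_v = True
        yield v if seen_v else u
```

With the two input signals from the example (`u,u,u,u,u,u,...` and `_,_,_,_,v,v,...`), this produces exactly the single trace `u,u,u,u,v,v,...`, which is the enforced behaviour rather than the set of all traces that plain LTL Until would admit.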

Note: please bear with me; I am no computer scientist or mathematician, just an average programmer trying to learn new things.