office 365 – Video failed to upload to Microsoft Stream

We usually record our meetings through Teams, and these get automatically uploaded to Stream. However, yesterday the automatic upload failed with this message:

recording failed to upload to Stream

I downloaded the video manually (through Teams). I was hoping to upload it to our Stream manually, however every time I try I receive the following error message:

video.mp4 failed to upload. Changes to this Microsoft Stream tenant are temporarily disabled. Please try again later.
Failed: video.mp4

Any ideas what might be causing this?

I have tried multiple times, yesterday and today.
I also asked another team member to try, and they receive the same error message.

Background Video – Stream vs Host / Mobile vs Desktop SEO effect

Background videos have no direct impact on SEO. Google doesn’t penalize for background video any more than it does for images.

This is all down to the implementation details.

  • Will the video delay the load of the usable elements of the page?
  • Does the video make it harder to use the rest of the page?

If the page renders and becomes usable quickly while the video is still loading, it shouldn’t hurt your SEO. Google does measure how quickly a page loads, and a large video can make your page load much more slowly than it did before. The worst case would be a page that doesn’t render at all until the video has completely downloaded. When you implement the video, test to make sure that the visible portions of the page other than the video load just as quickly as they did before the video was added.

algorithms – Distributed predicate computation on event stream

My question is actually a request for papers, articles, texts or books on a problem that I’m trying to solve at work.

I’m working on a program that computes a predicate value (true or false) for a given object in a distributed system, in which a stream of events can change the object’s attributes and, consequently, the predicate value. Whenever the predicate value changes, the program must send a notification about the change.

For example, consider an object A which has an attribute called name, and a predicate P which is true when the object’s name is equal to Jhon.
Each event in the stream has a timestamp and a value for the attribute name. Consider the following sequence of events:

e1 = { name: Jhon, timestamp: 1 }
e2 = { name: Jhon, timestamp: 2 }
e3 = { name: Peter, timestamp: 3 }
e4 = { name: Doug, timestamp: 4 }
e5 = { name: Jhon, timestamp: 5 }

Now, the events don’t necessarily show up in the stream in the correct order and, even worse, there are multiple computers processing this stream of events in parallel. However, for simplicity, I’ll continue the example considering only one computer.

If the events arrive and are processed in the order described above, then the notifications sent should be:

P(A) = true when e1 arrives
P(A) = false when e3 arrives
P(A) = true when e5 arrives.

That is the correct sequence of notifications.
Now, imagine that the computer receives the events in the following order:

e1, e5, e2, e4, e3

A naive algorithm which doesn’t consider the events’ timestamps would send an incorrect sequence of notifications:

P(A) = true when e1 arrives
P(A) = false when e4 arrives

The algorithm that I’m working on considers the timestamps and infers when a notification should have been sent but was not. So when e3 arrives, it will notice that the notification P(A) = true for e5 was not sent.
This feels a bit like reinventing the wheel, but I’m not aware of any literature about this problem.
I would like some references on this problem or something similar, such as papers dealing with this kind of problem.
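For what it’s worth, here is a minimal Python sketch of the timestamp-aware idea (the class and names are mine, not from any paper): keep a per-object buffer of events sorted by timestamp, recompute the predicate’s transition points on every arrival, and emit any transitions not yet notified.

```python
import bisect

class PredicateTracker:
    """Timestamp-aware notifier for a single object's predicate P(A)."""

    def __init__(self, predicate):
        self.predicate = predicate
        self.events = []    # (timestamp, value), kept sorted by timestamp
        self.notified = []  # (timestamp, bool) transitions already sent

    def on_event(self, ts, value):
        """Insert the event, recompute P's transition points, and return
        the notifications that should be sent now (possibly none)."""
        bisect.insort(self.events, (ts, value))
        transitions, prev = [], None
        for t, v in self.events:
            p = self.predicate(v)
            if p != prev:
                transitions.append((t, p))
                prev = p
        # Emit every transition we have not yet notified about.
        new = [n for n in transitions if n not in self.notified]
        self.notified = transitions
        return new
```

With the arrival order e1, e5, e2, e4, e3 from the example, this emits P(A) = true at t=1 on e1, nothing for e5 and e2, then both P(A) = false at t=4 and the missed P(A) = true at t=5 when e4 arrives, and finally a corrected false transition at t=3 when e3 arrives. Note it keeps every event for the object in memory, which is exactly what the shared-database constraint rules out at scale; watermark/punctuation techniques from out-of-order stream processing are the usual way to bound such a buffer.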

The real problem is quite a bit more complex, since it involves storing the predicate × object state in a database that works as shared state between the computers processing the stream, and I’m talking about thousands of events arriving per second, so it’s not possible to keep all events stored in a database.

network – Have I been hacked? (netstat output shows too many DGRAM and STREAM connections)


netstat output

These are screenshots of the netstat command I ran. They show many outbound connections and many DGRAM and STREAM sockets. I also captured the traffic with Wireshark and reverse-checked which organisations the IP addresses belong to; various organisations showed up (Google, Astricia).

I also tried turning off the Wi-Fi and running netstat again, but there was no change in the DGRAM and STREAM connections.

Please help; any input will be appreciated.

magento2 – Magento 2.4 installation using Composer fails (MagentoHackathon/Composer/Magento/Plugin.php failed to open stream: No such file or directory), rolling back

I am trying to install Magento 2.4 Open Source on the following setup.

VirtualBox 6.1 on Windows 10

Nginx server

Ubuntu 20.04 installed on the VirtualBox

PHP 7.3.20

Command used: composer create-project --repository-url= magento/project-community-edition .

I have given 0777 permissions to the /var/www/html/ directory

I am getting the following error: Plugin installation failed (include(/var/www/html/ failed to open stream: No such file or directory), rolling back

Detailed Error:

muk@muk:/var/www/html/$ composer create-project --repository-url= magento/project-community-edition .
Creating a "magento/project-community-edition" project at "./"
Installing magento/project-community-edition (2.4.0)
  - Installing magento/project-community-edition (2.4.0): Loading from cache
Created project in /var/www/html/
Loading composer repositories with package information
Updating dependencies (including require-dev)
Package operations: 526 installs, 0 updates, 0 removals
  - Installing magento/magento-composer-installer (0.1.13): Loading from cache
Plugin installation failed (include(/var/www/html/ failed to open stream: No such file or directory), rolling back
  - Removing magento/magento-composer-installer (0.1.13)

  Could not delete /var/www/html/

create-project (-s|--stability STABILITY) (--prefer-source) (--prefer-dist) (--repository REPOSITORY) (--repository-url REPOSITORY-URL) (--add-repository) (--dev) (--no-dev) (--no-custom-installers) (--no-scripts) (--no-progress) (--no-secure-http) (--keep-vcs) (--remove-vcs) (--no-install) (--ignore-platform-reqs) (--) (<package>) (<directory>) (<version>)

Can the live caption in Google Meet be seen in a live stream of the meeting?

I would like to live stream a Google Meet to YouTube or Facebook, but I also want the closed captioning to be visible in the live stream. Does the closed captioning pull through to the live stream?

Media Comments on Activity Stream when using Open Social

I have added the ability for users to upload Media to my Open Social install. I have a Media Type of Photo and have configured Media to create a Message that displays in the activity stream.

For some reason, no matter which media activity you comment on in the stream, the comment is displayed on the topmost media activity. The stream is sorted by Authored on (descending).

I have easily recreated this on a fresh Open Social install as well.

Fresh Install
Enable Media
Create Comment type for Media
Create Media Type of Photo
Add Comment field to Media Type: Photo
Create new message template to display Media: Photo on the Stream (home) when media is created.
You also need to edit the Activity stream view and remove “Activity: Filter activities for personalised homepage (= )” from the filter. (I still don’t know what this filter fully does.)


Error message: Notice: Trying to get property ‘target_id’ of non-object in
Drupal\social_post\Plugin\Field\FieldFormatter\CommentPostFormatter->viewElements() (line 115 of

Complete Code:


<?php

namespace Drupal\social_post\Plugin\Field\FieldFormatter;

use Drupal\comment\Plugin\Field\FieldType\CommentItemInterface;
use Drupal\comment\Plugin\Field\FieldFormatter\CommentDefaultFormatter;
use Drupal\Core\Field\FieldItemListInterface;
use Drupal\Core\Form\FormStateInterface;
use Drupal\Core\Entity\EntityInterface;
use Drupal\comment\CommentManagerInterface;
use Drupal\comment\CommentInterface;
use Drupal\Core\Link;

/**
 * Provides a post comment formatter.
 *
 * @FieldFormatter(
 *   id = "comment_post",
 *   module = "social_post",
 *   label = @Translation("Comment on post list"),
 *   field_types = {
 *     "comment"
 *   },
 *   quickedit = {
 *     "editor" = "disabled"
 *   }
 * )
 */
class CommentPostFormatter extends CommentDefaultFormatter {

  /**
   * {@inheritdoc}
   */
  public static function defaultSettings() {
    return [
      'num_comments' => 2,
      'order' => 'ASC',
    ];
  }

  /**
   * {@inheritdoc}
   */
  public function viewElements(FieldItemListInterface $items, $langcode) {
    $elements = [];
    $output = [];

    $field_name = $this->fieldDefinition->getName();
    $entity = $items->getEntity();

    $status = $items->status;

    $comments_per_page = $this->getSetting('num_comments');

    if ($status != CommentItemInterface::HIDDEN && empty($entity->in_preview) &&
      // Comments are added to the search results and search index by
      // comment_node_update_index() instead of by this formatter, so don't
      // return anything if the view mode is search_index or search_result.
      !in_array($this->viewMode, ['search_result', 'search_index'])) {
      $comment_settings = $this->getFieldSettings();

      $comment_count = $entity->get($field_name)->comment_count;

      // Only attempt to render comments if the entity has visible comments.
      // Unpublished comments are not included in
      // $entity->get($field_name)->comment_count, but unpublished comments
      // should display if the user is an administrator.
      $elements['#cache']['contexts'][] = 'user.permissions';
      if ($this->currentUser->hasPermission('access comments') || $this->currentUser->hasPermission('administer comments')) {
        $output['comments'] = [];

        if ($comment_count || $this->currentUser->hasPermission('administer comments')) {
          $mode = $comment_settings['default_mode'];
          $comments = $this->loadThread($entity, $field_name, $mode, $comments_per_page, FALSE);
          if ($comments) {
            $build = $this->viewBuilder->viewMultiple($comments);
            $output['comments'] += $build;
          }

          if ($comments_per_page && $comment_count > $comments_per_page) {
            $t_args = [':num_comments' => $comment_count];
            $more_link = $this->t('Show all :num_comments comments', $t_args);

            // Set link classes to be added to the button.
            $more_link_options = [
              'attributes' => [
                'class' => [],
              ],
            ];

            // Set path to post node.
            $link_url = $entity->urlInfo('canonical');

            // Attach the attributes.
            $link_url->setOptions($more_link_options);

            // Build the link.
            $more_button = Link::fromTextAndUrl($more_link, $link_url);
            $output['more_link'] = $more_button;
          }
        }
      }

      // Append comment form if the comments are open and the form is set to
      // display below the entity. Do not show the form for the print view mode.
      if ($status == CommentItemInterface::OPEN && $comment_settings['form_location'] == CommentItemInterface::FORM_BELOW && $this->viewMode != 'print') {
        // Only show the add comment form if the user has permission.
        $elements['#cache']['contexts'][] = 'user';
        $add_comment_form = FALSE;
        // Check if the post has been posted in a group.
        $group_id = $entity->field_recipient_group->target_id;
        if ($group_id) {
          /** @var \Drupal\group\Entity\Group $group */
          $group = entity_load('group', $group_id);
          if ($group->hasPermission('add post entities in group', $this->currentUser) && $this->currentUser->hasPermission('post comments')) {
            $add_comment_form = TRUE;
          }
        }
        elseif ($this->currentUser->hasPermission('post comments')) {
          $add_comment_form = TRUE;
        }
        if ($add_comment_form) {
          $output['comment_form'] = [
            '#lazy_builder' => ['comment.lazy_builders:renderForm', [
              $entity->getEntityTypeId(),
              $entity->id(),
              $field_name,
              $this->getFieldSetting('comment_type'),
            ]],
            '#create_placeholder' => TRUE,
          ];
        }
      }

      $elements[] = $output + [
        '#comment_type' => $this->getFieldSetting('comment_type'),
        '#comment_display_mode' => $this->getFieldSetting('default_mode'),
        'comments' => [],
        'comment_form' => [],
        'more_link' => [],
      ];
    }

    return $elements;
  }

  /**
   * {@inheritdoc}
   */
  public function settingsForm(array $form, FormStateInterface $form_state) {
    $element = [];
    $element['num_comments'] = [
      '#type' => 'number',
      '#min' => 0,
      '#max' => 10,
      '#title' => $this->t('Number of comments'),
      '#default_value' => $this->getSetting('num_comments'),
    ];
    $orders = [
      'ASC' => $this->t('Oldest first'),
      'DESC' => $this->t('Newest first'),
    ];
    $element['order'] = [
      '#type' => 'select',
      '#title' => $this->t('Order'),
      '#description' => $this->t('Select the order used to show the list of comments.'),
      '#default_value' => $this->getSetting('order'),
      '#options' => $orders,
    ];
    return $element;
  }

  /**
   * {@inheritdoc}
   */
  public function settingsSummary() {
    return [];
  }

  /**
   * {@inheritdoc}
   *
   * @see \Drupal\comment\CommentStorage::loadThread()
   */
  public function loadThread(EntityInterface $entity, $field_name, $mode, $comments_per_page = 0, $pager_id = 0) {
    // @TODO: Refactor this to use CommentDefaultFormatter->loadThread with dependency injection instead.
    $query = db_select('comment_field_data', 'c');
    $query->addField('c', 'cid');
    $query
      ->condition('c.entity_id', $entity->id())
      ->condition('c.entity_type', $entity->getEntityTypeId())
      ->condition('c.field_name', $field_name)
      ->condition('c.default_langcode', 1)
      ->addMetaData('base_table', 'comment')
      ->addMetaData('entity', $entity)
      ->addMetaData('field_name', $field_name);

    $comments_order = $this->getSetting('order');

    if (!$this->currentUser->hasPermission('administer comments')) {
      $query->condition('c.status', CommentInterface::PUBLISHED);
    }
    if ($mode == CommentManagerInterface::COMMENT_MODE_FLAT) {
      $query->orderBy('c.cid', $comments_order);
    }
    else {
      // See comment above. Analysis reveals that this doesn't cost too
      // much. It scales much much better than having the whole comment
      // structure.
      $query->addExpression('SUBSTRING(c.thread, 1, (LENGTH(c.thread) - 1))', 'torder');
      $query->orderBy('torder', $comments_order);
    }

    // Limit the number of results.
    if ($comments_per_page) {
      $query->range(0, $comments_per_page);
    }

    $cids = $query->execute()->fetchCol();

    $comments = [];
    if ($cids) {
      $comments = entity_load_multiple('comment', $cids);
    }

    return $comments;
  }

}

XXX Discussions, Share Premium Porn, Homemade & Live Captures – Stream Sharing Community (Responsive) | NewProxyLists

Thanks for your advice! Please don’t worry about our activities and instead worry about your own.

You wanted feedback. We’re discussing your pitiful site, not mine.

This just proves you don’t have a clue what you’re talking about. YesPornPlease’s end was due to a legal battle with the owners of Pornhub (MindGeek). Pornhub is likely the LEAST safe!

I was talking about embedding being the safest; Pornhub even has an affiliate program for embedders. As for having no clue, you have no idea who and what I am in the warez industry.

We are glad you’ve made this decision. You would be the last person we want on our site. Surprised @M hasn’t banned you far too toxic especially for moderating standards. You should try showing some respect maybe.

Not a single hair thinks about uploading to your site. Waste of time.

You wanted feedback. You got it. You know … if you can’t stand the heat, get out of the kitchen. I was not that toxic (yet). M and I go a long way.

Considering I have over 200 uploads and another user has over 200 uploads, that equals over 400

Uploads … are you mad? But anyway, less than the ‘thousands’ you advertise.

You are mad that he removed you from uploading to MovieMafia; get over it. You abused users’ links by removing them and adding your own.

Are you out of your mind? You must be very confused. I only keep myself busy with profitable business. I wouldn’t stand anywhere too close to that no traffic site.

No thanks. *Update* I decided to check it; are you talking about this PandaMovies?

No … try again. Update: Because I am a too nice of a guy …
He has multiple sites like that. He will be there in the long run instead of the three months you will last.

We are not trying to piss off our visitors. The “Shortening Disaster” is only for non-registered members.

Can you provide links to your sites?

Disaster … pretty good description. And do you really think visitors will register for that fail.

And who’s site are we talking about?

Porn Sera Sera … what never will be, will be, the future’s ours to see …. porn sera sera bye bye….. Doris Day.

stream processing – How to keep a data warehouse updated?

Suppose there is a system (like an ERP) that writes to a database (not too big, less than 100 GB). You need to export the data from this database to a data warehouse (like Redshift or BigQuery) as many times a day as you can. What would be a good solution for that?
There is a feature in the system that exports only the delta, so this is what I was thinking:

1 – Write an ETL script to query the delta, format it as Avro and save it in a bucket (GCS or S3)
2 – Trigger a function when the object is inserted; get the object and insert it into a staging table (one for each table in the origin DB)
3 – Trigger a function to merge the staging table into the main table
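For step 3, the merge can be one set-based statement instead of row-by-row updates. A sketch that generates an ANSI/BigQuery-style MERGE (the table, key and column names here are placeholders, not from your system):

```python
def build_merge_sql(main, staging, key, columns):
    """Builds an ANSI/BigQuery-style MERGE upserting staging rows into main."""
    set_clause = ", ".join(f"{c} = s.{c}" for c in columns)
    cols = ", ".join([key] + columns)
    src_cols = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE {main} m USING {staging} s ON m.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({src_cols})"
    )

# Hypothetical tables: staging_orders is the staging table for orders.
sql = build_merge_sql("orders", "staging_orders", "id", ["status", "total"])
```

Because the whole staging batch is applied in one statement, the warehouse does a single bulk operation per load, which sidesteps its per-row update cost.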

I’m not too happy with this approach, because it feels so limited. I think I’m missing something here. Should data in a DW be so hard to maintain? I see a lot of examples of how to insert data into a DW, but very few on how to keep it updated.

Also, suppose that this delta mechanism didn’t exist and we had to use a streaming solution (like Kinesis). That would make things even harder, because data would arrive in the bucket much faster, generating lots of files. How could I handle a scenario like this, given that DWs are slow to update row by row (BigQuery even limits the number of updates per day)?
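For the streaming case, one common mitigation is to micro-batch records before writing to the bucket, so the warehouse sees a few large files instead of many tiny ones. A rough sketch, where `upload_fn` and the two thresholds are placeholders for your own writer and tuning:

```python
import time

class BatchBuffer:
    """Buffers streamed records and flushes them in batches, so the
    warehouse loads a few large files instead of many tiny ones."""

    def __init__(self, upload_fn, max_records=5000, max_age_s=60.0):
        self.upload_fn = upload_fn
        self.max_records = max_records
        self.max_age_s = max_age_s
        self.records = []
        self.started = None

    def add(self, record):
        if not self.records:
            self.started = time.monotonic()
        self.records.append(record)
        # Flush on size or age, whichever comes first.
        if (len(self.records) >= self.max_records
                or time.monotonic() - self.started >= self.max_age_s):
            self.flush()

    def flush(self):
        if self.records:
            self.upload_fn(self.records)  # e.g. write one Avro file to the bucket
            self.records = []
```

The age threshold bounds latency and the size threshold bounds file count, which is essentially what managed micro-batch loaders do under the hood.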

streaming algorithm – A way to express an LTL variant to enforce that a stream of data satisfies some linear-time logic

Linear Temporal Logic (LTL) is used for system verification. In my case, I am investing some time to see the feasibility of using LTL to enforce a constraint on a stream of data. Enough generalities; let’s take a simple example:

The operator UNTIL in the LTL expression u UNTIL v means event u holds until v does; it is a general formula that an infinite number of signal traces could satisfy. See its definition here:



In my case, I want to enforce an LTL-like formula on a system receiving a stream of data. Again, let’s take the same operator UNTIL.

Let’s say we have two input signals, one for constant u and one for constant v.

 , , , ,v,v,...

The stream processor taking these inputs, if it is an “UNTIL*” node, would output:


The reason I differentiate UNTIL with an asterisk is the whole point of the question: “u UNTIL* v” is only true when v is taken as output as soon as it appears in the second stream; it is one single trace satisfying “u UNTIL* v” given our input signals. How do I express this constraint? LTL seems too general for this “constraint-enforcing mechanism”.
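As a concrete illustration of the intended UNTIL* behaviour, here is a minimal Python sketch; the stream representation, with `None` marking an empty slot on the v signal, is my own assumption:

```python
def until_star(u_stream, v_stream):
    """UNTIL* node sketch: pass u through until v first appears on the
    second stream, then emit v immediately and stop; None = empty slot."""
    out = []
    for u, v in zip(u_stream, v_stream):
        if v is not None:
            out.append(v)  # v showed up: it must be taken as output now.
            break
        out.append(u)      # v not seen yet: keep emitting u.
    return out

# With u on every tick and v first appearing at tick 5:
trace = until_star(["u"] * 6, [None, None, None, None, "v", "v"])
# trace == ["u", "u", "u", "u", "v"]
```

Forcing the output to switch on exactly the first occurrence of v is what makes this a single admissible trace rather than the whole family of traces plain LTL UNTIL allows.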

Note: please bear with me; I am neither a computer scientist nor a mathematician, just an average programmer trying to learn new things.