Robbery, theft, a body found, an accident and a homicide; check out the Quixeramobim police blotter

A weekend with a series of incidents in Quixeramobim, according to the bulletin released by the Military Police through the Military Police Operations Center (COPOM). Read on:

Saturday, 23

At around 9:34 a.m., a 58-year-old man living on Rua Ana Xavier de Jesus, in the José Airton Machado neighborhood, told the Military Police that after drinking with a woman on Friday night, identified only by the initial B., she allegedly returned to his house in the early hours of Saturday and stole a gas cylinder, a Sony DVD player, a butcher-style knife and a bottle of massage gel. The police are still investigating.

Later that night, a person was robbed on Rua Francisco Medeiros, in the Salviano Carlos da Silva neighborhood. The 20-year-old woman told officers that she was on her way home when she was approached by an individual on an unidentified motorcycle who announced a holdup and took her Samsung J2 cell phone, R$ 115.00 in cash and a bag containing her documents, then fled in an unknown direction. The perpetrator has not yet been identified or located.

Sunday, 24

COPOM received a call via 190 at 9 a.m. reporting a body found on Rua Oto Rocha, in the rural district of São Miguel. A patrol unit was dispatched to the scene. The victim was identified as Mr. L. P. S. According to the victim's daughter, she last saw her father on Friday the 22nd, and only went to look for him at his home yesterday, the 24th, when she found the body already in a state of decomposition.

At 2:33 p.m., the Military Police also served an outstanding warrant at the AABB club, where a man identified as F. A. G. S. was found. The man presented a release order, but since the warrant still appeared as outstanding in the Public Security Secretariat's query system (SIP), he was taken to the Regional Civil Police Station in Quixadá so his situation could be verified.

One of the most serious cases was recorded at around 9 p.m.: a fatal stabbing. Police received a call reporting that a woman was being assaulted by an individual on Rua Wellington Martins, in the Salviano Carlos (Matadouro) neighborhood. Arriving at the scene, officers were told that two individuals, one identified as P. R. and the other, his son, not yet identified, had stabbed a man identified as J. G. A. A.

The victim was rescued by bystanders but died before reaching the hospital. The suspects fled in an unknown direction. According to witnesses, the altercation began with the victim's wife, and when he intervened, the suspect P. R. allegedly held him down while his son delivered the blows: three stab wounds to the neck. Searches are under way, but the suspects have not yet been located.

Today, 25
At midnight today, a traffic accident occurred involving a pedestrian and a bus. When officers arrived at the scene, they found a man, J. J. S. S., lying on the ground with a serious injury to his left leg. A SAMU ambulance was immediately dispatched to assist the victim.

The bus driver, A. S. O., was still at the scene. According to the driver, who was covering the Fortaleza–Tocantins route, he was traveling along highway CE-060 when he came upon the victim lying on the road; he tried to swerve but still struck the man's leg.

Gleisi is re-elected PT president: “I want Moro in prison and Lula as president”

Federal deputy Gleisi Hoffmann (PT-PR) was re-elected national president of the PT this Sunday (24). With the backing of former president Lula, Gleisi received the large majority of the party delegates' votes: 558 votes for Gleisi, 131 for Margarida Salomão and 91 for Valter Pomar. Paulo Teixeira withdrew from the race to support the re-election of Lula's candidate.

Speaking after the vote count, Gleisi thanked her “comrades” for their support and called for unity in the PT. “This contest ends here. Now, what is essential for our struggle is the unity of the PT,” said the deputy, who will head the party for another two years. (From Congresso em Foco)

Police station investigating 500 inquiries opened into document forgery in Ceará

Attempts to issue identity cards with false information, as well as the use of illegal documents, have been drawing the attention of the Ceará Civil Police. In 2019 alone, the Fraud and Forgery Police Station (DDF) has opened approximately 500 inquiries to investigate these types of crimes.

According to the head of the DDF, police chief Eduardo Tomé, there is no typical suspect profile. Those charged even include elderly people who tried to obtain documents using other people's data in order to defraud Social Security. From January of this year through last Wednesday (20), 21 people were caught in the act by DDF officers in possession of forged documents.

Some of the arrests were made with support from the Human Identification and Biometric Forensics Coordination of the Ceará Forensic Service (Pefoce). Pefoce's work of cross-checking information in a database has been preventing suspects from receiving false identity cards. (From G1-CE)

More than half of Ceará's mayors face lawsuits in court

More than half of the mayors in Ceará are facing court proceedings for administrative misconduct. A G1 survey shows that 96 of the state's 184 mayors are defendants in this type of action. The record holder is Mayor Raimundo Marcelo Arcanjo, of Santana do Acaraú, who faces 12 lawsuits in less than three years in office. Together, the mayors are the targets of 246 court proceedings for administrative misconduct.

In Santa Quitéria, Mayor Tomás Antônio Albuquerque de Paula Pessoa authorized the construction of a private vaquejada (rodeo) arena using public employees and machinery belonging to the municipality.

Before the City Council, the mayor acknowledged that the work had been done because, according to him, the practice was “natural” and had already been adopted at other private developments in the town. The work was uncovered through drone footage, which captured the construction and revealed the paving of a road in a rural area.

Prosecutor Dérick Funck, responsible for the Santa Quitéria case, requested a conviction for collective moral damages in the amount of R$ 500,000, as well as the mayor's removal from office. Among the irregularities pointed out by the Public Prosecutor's Office are:

– violation of the Fiscal Responsibility Law (LRF);
– improper hiring of machinery through fraudulent bidding;
– hiring of a law firm with personal ties to the mayor through rigged bidding;
– deterioration of and fraud in the provision of school transportation.

G1 tried to hear from the president of the Association of Municipalities of the State of Ceará (Aprece), Francisco Nilson Alves Diniz, and from the mayors named, but they did not respond to the interview request. (From G1-CE)

The new look of the first television station in central Ceará

Showing the reality of those who live in the Ceará backlands is a task for those who witness the life of the sertão people first-hand. And that is what the new face of SerTão TV sets out to do: the first broadcaster in the central region of Ceará.

During the planning and execution of the first visual adjustments to the station's brand, the discussion centered on what to show. The goal was not to draw the cracked soil and the drought, a real part of life and so often portrayed and remembered whenever people speak of the sertão, but to show our region through different eyes. It felt like time to bring its beauty into the spotlight.

Next came the colors and tones. The yellow of the sun and the indigo blue of the sky were not left out, but the reddish tones of clay were added, along with the green, imposing xique-xique cactus, which stays standing throughout the entire dry season and stands as a symbol of the sertanejo's grit.

With those discussions settled, it was time to add the people. Beyond the vaqueiros, the woodcut illustrations show an artistic sertão, of men and women, shaping itself as a space for everyone, with room for tastes, knowledge, flavors, loves and plenty of celebration, sketching the profile of a people who, despite everything, have the strength to keep moving forward.

That is how the station's new artwork took shape, together with architect Marcela Simão, who sums up the sertão in three words: beauty, grit and strength. And that is the window SerTão TV has come to open. It is time to show this rich and diverse region to the world. This will be done with all the responsibility, respect, care and attention our people deserve, backed by the Sistema Maior de Comunicação and all the other companies in the group and, above all, through a partnership with Brisanet Telecomunicações, which will carry the station's signal on its HDTV platform.

It is a joint effort that will take time, but one meant to help the public grow familiar with the new brand, its colors, its lettering and its shapes, so that when people see it on the building, in the streets and on social media, they will recognize the station's new look. SerTão TV arrives to be the face and voice of the people of Ceará; after all, it is made for you.

We will grow together with this region, with all these people who embrace, welcome and love. This is the Ser Tão Maior de Coração, where unity prevails over any adversity.

FragPipe — MSFragger made a bit more mortal, and a lot more powerful!

I’ve been meaning to do some tests with the new iterations of MSFragger and finally had a few minutes to do some extremely limited tests. More coming. 
Let’s clarify what we have here:
MSFragger is the search engine. It is impossibly, ridiculously fast in command line format. If you had to remember one thing about MSFragger it’s that it’s the open search engine. Let it search for any delta mass shifts within 500 Da of your target. It’ll find those masses, and it will find them in seconds or minutes, not hours, days, months, or millennia (some engines might honestly take more than a human lifespan to do a delta search; I don’t have the patience to verify). I also only used 500 Da as an example.
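If the open-search idea is new to you, the precursor-mass side of it can be sketched in a few lines. This is a toy illustration only, not MSFragger's actual fragment-index algorithm, and the database masses below are made up for the example:

```python
# Toy sketch of an "open" precursor search: instead of a narrow
# tolerance (e.g. 10 ppm), accept any candidate peptide whose mass
# is within a huge window (here 500 Da) and report the delta mass,
# which often corresponds to an unanticipated modification.
OPEN_WINDOW_DA = 500.0

def open_search(precursor_mass, peptide_masses):
    """Return (peptide, delta_mass) pairs within the open window."""
    hits = []
    for peptide, mass in peptide_masses.items():
        delta = precursor_mass - mass
        if abs(delta) <= OPEN_WINDOW_DA:
            hits.append((peptide, round(delta, 4)))
    return hits

# Hypothetical database of monoisotopic peptide masses (Da)
db = {"PEPTIDE": 799.3599, "ELVISLIVES": 1100.6219}

# A precursor ~57.02 Da heavier than PEPTIDE suggests carbamidomethylation
print(open_search(856.3814, db))
```

The speed problem, of course, is that a real run does this against millions of candidates per spectrum and then scores fragment ions, which is the part MSFragger's indexing makes fast.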
MSFragger nodes work directly in Proteome Discoverer and the newest ones even work with all the quan nodes! 
From what I wrote in that post, I’m trying to guess which computer I used for it…and I’m going to guess it was a 7th gen i7 laptop that I don’t use much these days. Last year TSA agents were forced to work without pay for 35 days or so and…well…airport security was a little more intense at times, and my laptop got dropped and bent, and getting it to power on requires some careful flexing around the power button. I’m only rambling about it because it might be important in a minute.
FragPipe is the package that contains MSFragger and all sorts of great stuff to make it useful. I’m using the Windows GUI version and it links directly into the Philosopher and it does all sorts of cool stuff like automatically downloading and formatting your FASTA files and directly reading from RAW files! 
I haven’t tried the DIA Umpire options yet (or many others, but I suspect I’ll get to them soon. This is WAY too cool not to spend time on). 
For those of us who get to one format of data input/output and get stuck on it a little — there are MSFragger Proteome Discoverer nodes now! I’ve got a partially written post on here somewhere that I started a month ago and just realized I never finished… I’m on the wrong continent and I don’t want Proteome Discoverer on the PC I am using right now.
There are some steps involved in setting up FragPipe, but the tutorials are great, and the software gives you prompts. Got a Java that’s too new? Installed an old version of Philosopher a while back? It fixes all that stuff. 
To the mortal part — and the PC. When I first ran the command line MSFragger (I linked the post above) I spent a lot of time just hitting the enter button to see if it really was completing searches faster than I could click the enter button again. I’m not kidding, it is that fast. To the point that you assume it errored out and there could be no data — but there was data! However, a huge list of potential matches and OPEN SEARCH shifted masses was a little overwhelming. Obviously lots of power but, for me, some limited practicality compared to MetaMorpheus, which provided a more practical output for me. (Is there some sort of Ann Arbor/Madison rivalry thing based around cricket or quidditch? I think there might be. Let’s extend it to free software!)
FragPipe is doing a lot of stuff when I run it now. Way more stuff than MSFragger was doing. It’s taking my vendor format binary file (RAW) and it’s running with that. It’s making a file of my decoy reverse stuff. And — it’s running on an i3 I got off Amazon for <$300 and upgraded (SSD and RAM) to be slightly less of a catastrophe (go ahead and drop it TSA, everything is backed up on Cloud things!)  
Total run time per file? I’m hitting around 11 minutes with a 500 Da open search. A 500 Da open search!  That’s hella fast. It’s just not instantaneous. It could be the laptop I got for typing and battery life and NOT for data processing, but it could be that MSFragger is the fastest thing ever, but all those other new steps take time. I’ll check on other PCs later.
But — does it work? I wouldn’t be typing this at 3am if it didn’t! 
I didn’t tell FragPipe about any modifications at all and just ran an open search. I kicked the TSV file out into Excel and used the “Ideas Artificial Intelligence Button” (LOLs!) to make a pivot table and chart. What do you know — loads of iodoacetamide and alkylations. (I think that’s a nice test for an open search engine. Find the stuff you know is there first!)
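For the curious, what that pivot table is doing can be reproduced in a few lines of Python: bin the delta masses and count the most common ones. The file layout and column name here are assumptions, so check the header of your own TSV first:

```python
# Tally the most common delta masses from an open-search results TSV.
# Column name "delta_mass" is hypothetical; adjust to your output.
import csv
from collections import Counter

def top_delta_masses(tsv_path, column="delta_mass", bin_width=0.01, n=5):
    """Bin delta masses to bin_width Da and return the n most common bins."""
    counts = Counter()
    with open(tsv_path, newline="") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            delta = float(row[column])
            counts[round(delta / bin_width) * bin_width] += 1
    return counts.most_common(n)

# e.g. top_delta_masses("psm.tsv") might put ~57.02 Da
# (carbamidomethylation) near the top of the list.
```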
FragPipe — Totally works — easy to use — still crazy fast — and now has all the tools around it to make MSFragger a complete package. 

Is Something Different this Time about the Effect of Technology on Labor Markets?

There’s a well-worn conversation about the relationship between new technology and possible job displacement which goes something like this:

Concerned person: “New developments in information technology and artificial intelligence are going to threaten lots of jobs.”

Skeptical person: “Economies in developed countries have been experiencing extraordinary developments and shifts in new technology for literally a couple of centuries. But as old jobs have been dislocated, new jobs have been created.”

Concerned person: “This time seems different.”

Skeptical person: “Every time is different in the specific details. But there’s certainly no downward pattern in the number of jobs in the last two centuries, or the last few decades.”

Concerned person: “Still, the way in which information technology and artificial intelligence replace workers seems different than the way in which, say, assembly lines replaced skilled artisan workers or combine harvesters replaced farm workers. “

Skeptical person: “Maybe this time will be different. After all, it’s logically impossible to prove that something in the future will NOT be different. But based on the long-run historical pattern, the evidence that new technology leads to shifts in the labor market is clear-cut, while the evidence that it leads to permanent job loss for the population as a whole is nonexistent.”

Concerned person: “Still, this current wave of technology seems different.”

Skeptical person: “I guess we’ll see how it unfolds in the next decade or two.”

The most recent Spring 2019 issue of the Journal of Economic Perspectives has a symposium on “Automation and Employment.” Two of the articles in particular offer concrete arguments about how something is different in how the current new technologies are interacting with labor markets.

Daron Acemoglu and Pascual Restrepo discuss “Automation and New Tasks: How Technology Displaces and Reinstates Labor.” They suggest a framework in which automation can have three possible effects on the tasks that are involved in doing a job: a displacement effect, when automation replaces a task previously done by a worker; a productivity effect in which the higher productivity from automation taking over certain tasks leads to more buying power in the economy, creating jobs in other sectors; and a reinstatement effect, when new technology reshuffles the production process in a way that leads to new tasks that will be done by labor.
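As a back-of-the-envelope illustration of how the three effects add up (the numbers below are made up for illustration, not the paper's estimates):

```python
# Toy version of the Acemoglu-Restrepo accounting: the net change in
# labor demand is the sum of three effects. All figures are invented.
productivity_effect = 1.5    # % per year: higher output raises buying power
displacement_effect = -0.7   # % per year: automation takes over tasks
reinstatement_effect = 0.4   # % per year: new tasks are created for labor

net_change_in_labor_demand = (
    productivity_effect + displacement_effect + reinstatement_effect
)
print(f"net change in labor demand: {net_change_in_labor_demand:+.1f}% per year")
```

The point of the decomposition is that the sign of the total is an empirical question: a larger displacement term or a smaller reinstatement term can flip it.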

In this approach, the effect of automation on labor is not predestined to be good, bad, or neutral. It depends on how these three factors interact. Acemoglu and Restrepo attempt to calculate the size of these three factors for the US economy in two time periods: 1947-1987 and 1987-2017. There is of course considerable technological change through all of this 70-year period. For example, I’ve written on this blog about “Automation and Job Loss: The Fears of 1964” (December 1, 2014) and “Automation and Job Loss: Leontief in 1982” (August 22, 2016). But the later period can be associated more closely with the rise of computers and information technology.

Their calculations suggest that in the 1987-2017 period, the effects of automation have involved a larger displacement effect, lower productivity growth, and a lower reinstatement effect. The lower demand for labor can be seen in stagnant wage growth over this period for lower- and medium-skilled workers. They argue that the real issue isn’t whether automation displaces tasks and alters jobs (of course it does), but rather how those displacement effects compare to how automation leads to greater productivity and to the possibility of new job-related tasks that reinstate labor. They argue that public policy has some power to affect how the forward movement of technology will affect the demand for labor: for example, they argue that public policy has tended to favor investment in new equipment and machinery over investment in human capital, like on-the-job training by employers.

Another angle on new technology and labor markets in the same issue of JEP comes from Jeremy Atack, Robert A. Margo, and Paul W. Rhode in “`Automation’ of Manufacturing in the Late Nineteenth Century: The Hand and Machine Labor Study.”  The focus of their paper is on a remarkably detailed US government study done in the 1890s of how machines were replacing the tasks involved in specific jobs.

The new assembly line machines of that time clearly displaced large numbers of tasks previously done by workers. However, the productivity effects of this wave of automation were very large. In addition, the new automation technology of that time had a powerful reinstatement effect of creating new tasks to be done by workers. They write:

[T]he net effect of the introduction of new tasks on labor demand appears to have been positive. This is because the share of time taken up by new tasks in machine labor was larger than the share of time associated with hand tasks that were abandoned�indeed, five times larger. Among other activities, these new tasks included maintenance of steam engines, a foreman supervising large numbers of workers (discussed further below), and workers packaging products for distant markets.

Atack, Margo, and Rhode also offer a broader point about technology and labor that seems to me worth considering. They point out that back in the 1890s, with a much heavier use of machines in the production process, there was a shift toward a broader division of labor: that is, the study counted more overall tasks to be done when machines were used, as compared to before the machines were used. One implication for workers of that time is that the path to a steady and well-paid job was to focus on a very particular niche of the production process. Indeed, one broad description of labor markets at this time is that there was a shift away from artisan workers (say, blacksmiths) who carried out many tasks, and toward workers who focused on a smaller set of tasks.

The authors suggest that one way in which modern technology is different from the 1890s is that it does not reward or encourage this kind of extreme division of labor. They write: 

The massive division of labor documented front and center in the Hand and Machine Labor study dramatically affected the nature of the human capital investment decision facing successive cohorts of American workers contemplating whether to enter the manufacturing sector. Earlier in the nineteenth century, the human capital investment problem such workers faced was mastering the diverse set of skills associated with most or all of the tasks involved in making a product, along with managing the affairs of a (very) small business, an artisan shop. The human capital investment problem facing the prospective manufacturing worker in the 1890s was quite different. There was little or no need to learn how to fashion a product from start to finish; mastery of one or two tasks would do, and such mastery might be gained quickly on the job. The more able or ambitious might gravitate to learning new skills, such as designing, maintaining, or repairing steam engines, or clerical/managerial tasks, the demand for which had grown sharply as average establishment size increased over the century.

For many decades in the twentieth century, specialization was economically beneficial to workers: the costs of learning skills were relatively modest and the return on the investment (a relatively secure, highly paid job in manufacturing) made that investment worthwhile. The prospect of widespread automation has arguably changed this calculus. No single “job” is safe and the optimal investment strategy may be very different: a suite of diverse, relatively uncorrelated skills as insurance against displacement by robotics and artificial intelligence. This is perhaps the sense in which the history of how technology affects jobs is not repeating itself, and “this time” really is different.

In watching the cohort that includes my own children move from high school into young adulthood, this observation seems to me to contain a lot of truth. When it comes to training for a future job, many of us are still mentally in the 1890s, looking for one or a few particular focused skills that will guarantee a “good job.” But modern technologies are likely to disrupt what tasks are actually done in a very wide array of jobs, which will put a premium on workers with the ability to shift flexibly as the job situation is reshaped. 

Static Percolator allows application to smaller datasets!!

Okay — if we’ve talked at a meeting about data processing — we’ve talked about this. I’m at an amazing meeting right now and I was in 2 great conversations about this concept already.

Percolator is fantastic. It is the gold standard for false discovery rate calculations, but it was designed for global applications. If you’ve looked at your data, you’ve seen this phenomenon where all of a sudden you can’t seem to trust what it is giving you by default.
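The target-decoy estimate underneath all of this is simple arithmetic; Percolator's contribution is learning a better score to feed into it. A minimal sketch of the estimate itself (not Percolator's semi-supervised machinery):

```python
# Target-decoy FDR in one line of arithmetic: at a given score cutoff,
# FDR ~ (# decoy hits above cutoff) / (# target hits above cutoff).
def target_decoy_fdr(scores, cutoff):
    """scores: list of (score, is_target) pairs from a combined search."""
    targets = sum(1 for s, is_target in scores if is_target and s >= cutoff)
    decoys = sum(1 for s, is_target in scores if not is_target and s >= cutoff)
    return decoys / targets if targets else 0.0

# Made-up PSMs: (score, is_target)
psms = [(3.1, True), (2.9, True), (2.5, False), (2.0, True), (1.2, False)]
print(target_decoy_fdr(psms, cutoff=2.0))  # 1 decoy / 3 targets -> 0.3333...
```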

Some of those videos over there on Proteome Discoverer are like 7 years old now? And I ramble incoherently about it there. But what is the solution?  Could it be this!?!?

I have a finite number of stored Obama Boom GIFs left, but this deserves one.

Static modeling Percolator!!

What’s the difference? Normal percolator is dynamic. It learns from that big ‘ol dataset you just gave it and that’s what it uses to set your parameters.

Static modeling flips the switch. What if it learns from a big ‘ol dataset and takes all that stuff it just learned and you apply those settings to the little dataset you just gave it?
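A toy sketch of the difference, with a simple score threshold standing in for Percolator's SVM (this is an illustration of the static-vs-dynamic idea only, not the actual implementation):

```python
# Static vs. dynamic rescoring, in miniature. "Training" here is just
# picking a score threshold; real Percolator learns a full model.
import random

random.seed(0)

def simulate_psms(n):
    """Fake PSMs as (score, is_target); targets score ~1 unit higher."""
    psms = []
    for _ in range(n):
        is_target = random.random() < 0.5
        score = random.gauss(1.0 if is_target else 0.0, 1.0)
        psms.append((score, is_target))
    return psms

def train_threshold(psms):
    """'Train' by taking the midpoint of target and decoy score means."""
    t = [s for s, is_target in psms if is_target]
    d = [s for s, is_target in psms if not is_target]
    return (sum(t) / len(t) + sum(d) / len(d)) / 2

# Static model: learn once from a big run...
big = simulate_psms(100_000)
threshold = train_threshold(big)

# ...then apply the frozen threshold to a tiny run, where estimating it
# from 50 PSMs alone would be noisy ("discordant").
small = simulate_psms(50)
accepted = [s for s, _ in small if s >= threshold]
print(f"threshold={threshold:.2f}, accepted {len(accepted)}/{len(small)}")
```

The design point is the same as the paper's: the big dataset has enough statistics to estimate the model stably, and the small dataset only has to be scored, not learned from.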

Well — it looks like you’ve got a smart Percolator right out of the box!

As shown in the picture at the very top that I stole and then clipfarted over — when your datasets are too small for Percolator to learn enough from, it gets “discordant” (their word). I like fuzzy better. I’ve always wondered if there was a static cutoff. It’s great at 100,000 PSMs. It can be BAD at 1,000 PSMs. Where is the cutoff?

What we learn here is that there is no set number (which makes sense; it would be weird if there was), but you get progressively fuzzier as PSM numbers drop (which we’ve all seen. we’re all totally smart. just not smart enough to fix it). These guys (one of them, I hear, has some evidence he knows something about how the whole thing works) just pulled the whole thing together.

100% Recommended reading (or at least skimming). It has big implications for our field and how we will process ALL our data in the future.

Edit: Want to know more? Check out this blog post!

Firefighters contain house fire in Quixeramobim

The Fire Department managed to contain a fire at a home at Rua Teófilo Lessa, 612, in the Monteiro de Moraes neighborhood, on Saturday afternoon, the 23rd.

According to the fire crew's report, the flames consumed a bedroom containing several pieces of furniture, including a bed, a wardrobe and bags of clothes.
Greater damage was avoided thanks to the firefighters' response. No injuries were reported at the scene.